| sha (stringlengths 40–40) | text (stringlengths 1–13.4M) | id (stringlengths 2–117) | tags (sequencelengths 1–7.91k) | created_at (stringlengths 25–25) | metadata (stringlengths 2–875k) | last_modified (stringlengths 25–25) | arxiv (sequencelengths 0–25) | languages (sequencelengths 0–7.91k) | tags_str (stringlengths 17–159k) | text_str (stringlengths 1–447k) | text_lists (sequencelengths 0–352) | processed_texts (sequencelengths 1–353) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
7b4522928061cf773aafb1e763bf20b851bc9477 | ## Description
14,980 images of PPT OCR data covering 8 languages. The dataset includes 8 languages, multiple scenes, different photographic angles, different photographic distances, and different lighting conditions. For annotation, line-level quadrilateral bounding boxes and text transcriptions were annotated. The dataset can be used for tasks such as multi-language OCR.
For more details, please refer to the link: https://www.nexdata.ai/datasets/979?source=Huggingface
## Data size
14,980 images, 8 languages
## Data environment
including meeting rooms and conference rooms
## Language types
French, Korean, Japanese, Spanish, German, Italian, Portuguese and Russian
## Data diversity
multiple scenes, multiple languages, different photographic angles, different photographic distances, and different lighting conditions
## Device
cellphone
## Collecting angles
front, left, right, and looking-up angles
## Data format
the image data format is .jpg; the annotation file format is .json
## Annotation content
line-level quadrilateral bounding box annotation and transcription for the texts
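The annotation schema is not documented in this card, so the following is only a minimal sketch of how such line-level quadrilateral annotations might be read from the .json files; the field names `points` and `transcription` are illustrative assumptions, not the dataset's actual keys.

```python
import json
from pathlib import Path


def load_annotations(json_path):
    """Read one annotation file and return (quadrilateral, text) pairs.

    Assumed (hypothetical) schema: a list of line-level records, each with a
    "points" field holding four [x, y] vertices and a "transcription" field.
    """
    records = json.loads(Path(json_path).read_text(encoding="utf-8"))
    lines = []
    for rec in records:
        quad = [(float(x), float(y)) for x, y in rec["points"]]  # 4 vertices
        lines.append((quad, rec["transcription"]))
    return lines


# Example usage (the path is a placeholder):
# for quad, text in load_annotations("sample_0001.json"):
#     print(quad, text)
```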
## Accuracy
the error bound of each vertex of the quadrilateral bounding box is within 5 pixels for a qualified annotation; the accuracy of the bounding boxes is not less than 95%, and the text transcription accuracy is not less than 95%
# Licensing Information
Commercial License | Nexdata/PPT_OCR_Data_of_8_Languages | [
"license:cc-by-nc-4.0",
"region:us"
] | 2024-02-04T09:28:23+00:00 | {"license": "cc-by-nc-4.0"} | 2024-02-04T10:05:15+00:00 | [] | [] | TAGS
#license-cc-by-nc-4.0 #region-us
| ## Description
14,980 Images PPT OCR Data of 8 Languages. This dataset includes 8 languages, multiple scenes, different photographic angles, different photographic distances, different light conditions. For annotation, line-level quadrilateral bounding box annotation and transcription for the texts were annotated in the data. The dataset can be used for tasks such as OCR of multi-language.
For more details, please refer to the link: URL
## Data size
14,980 images, 8 languages
## Data environment
including meeting room, conference room
## Language types
French, Korean, Japanese, Spanish, German, Italian, Portuguese and Russian
## Data diversity
multiple scenes, multiple languages, different photographic angles, different photographic distances, different light conditions
## Device
cellphone
## Collecting angles
front, left, right, looking up angle
## Data format
the image data format is .jpg, the annotation file data format is .json
## Annotation content
line-level quadrilateral bounding box annotation and transcription for the texts
## Accuracy
the error bound of each vertex of quadrilateral bounding box is within 5 pixels, which is a qualified annotation, the accuracy of bounding boxes is not less than 95%; the texts transcription accuracy is not less than 95%
# Licensing Information
Commercial License | [
"## Description\n14,980 Images PPT OCR Data of 8 Languages. This dataset includes 8 languages, multiple scenes, different photographic angles, different photographic distances, different light conditions. For annotation, line-level quadrilateral bounding box annotation and transcription for the texts were annotated in the data. The dataset can be used for tasks such as OCR of multi-language.\n\nFor more details, please refer to the link: URL",
"## Data size\n14,980 images, 8 languages",
"## Data environment\nincluding meeting room, conference room",
"## Language types\nFrench, Korean, Japanese, Spanish, German, Italian, Portuguese and Russian",
"## Data diversity\nmultiple scenes, multiple languages, different photographic angles, different photographic distances, different light conditions",
"## Device\ncellphone",
"## Collecting angles\nfront, left, right, looking up angle",
"## Data format\nthe image data format is .jpg, the annotation file data format is .json",
"## Annotation content\nline-level quadrilateral bounding box annotation and transcription for the texts",
"## Accuracy\nthe error bound of each vertex of quadrilateral bounding box is within 5 pixels, which is a qualified annotation, the accuracy of bounding boxes is not less than 95%; the texts transcription accuracy is not less than 95%",
"# Licensing Information\nCommercial License"
] | [
"TAGS\n#license-cc-by-nc-4.0 #region-us \n",
"## Description\n14,980 Images PPT OCR Data of 8 Languages. This dataset includes 8 languages, multiple scenes, different photographic angles, different photographic distances, different light conditions. For annotation, line-level quadrilateral bounding box annotation and transcription for the texts were annotated in the data. The dataset can be used for tasks such as OCR of multi-language.\n\nFor more details, please refer to the link: URL",
"## Data size\n14,980 images, 8 languages",
"## Data environment\nincluding meeting room, conference room",
"## Language types\nFrench, Korean, Japanese, Spanish, German, Italian, Portuguese and Russian",
"## Data diversity\nmultiple scenes, multiple languages, different photographic angles, different photographic distances, different light conditions",
"## Device\ncellphone",
"## Collecting angles\nfront, left, right, looking up angle",
"## Data format\nthe image data format is .jpg, the annotation file data format is .json",
"## Annotation content\nline-level quadrilateral bounding box annotation and transcription for the texts",
"## Accuracy\nthe error bound of each vertex of quadrilateral bounding box is within 5 pixels, which is a qualified annotation, the accuracy of bounding boxes is not less than 95%; the texts transcription accuracy is not less than 95%",
"# Licensing Information\nCommercial License"
] |
d38953ba7676c13aaa40ef2813fd971e5767ae94 |
# Dataset Card for Evaluation run of Kquant03/Cognito-2x7B-bf16
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Kquant03/Cognito-2x7B-bf16](https://huggingface.co/Kquant03/Cognito-2x7B-bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Kquant03__Cognito-2x7B-bf16",
"harness_winogrande_5",
split="train")
```
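Since there are 63 configurations, it can help to enumerate them programmatically and then pick the one you need. The sketch below uses the generic `datasets` API; the config name `harness_gsm8k_5` and the `latest` split are taken from this card's metadata:

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_Kquant03__Cognito-2x7B-bf16"

# Enumerate the evaluation configurations available in this details repository.
configs = get_dataset_config_names(repo)
print(len(configs), configs[:5])

# Load another task, e.g. the 5-shot GSM8K details; "latest" is the split that
# mirrors the most recent run (see the per-config splits in the card metadata).
gsm8k_details = load_dataset(repo, "harness_gsm8k_5", split="latest")
print(gsm8k_details)
```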
## Latest results
These are the [latest results from run 2024-02-04T09:26:22.410222](https://huggingface.co/datasets/open-llm-leaderboard/details_Kquant03__Cognito-2x7B-bf16/blob/main/results_2024-02-04T09-26-22.410222.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.6550063357391831,
"acc_stderr": 0.0320338761154218,
"acc_norm": 0.6540487782518486,
"acc_norm_stderr": 0.03271120506441515,
"mc1": 0.5740514075887393,
"mc1_stderr": 0.01731047190407654,
"mc2": 0.7169537206661909,
"mc2_stderr": 0.01479448042123873
},
"harness|arc:challenge|25": {
"acc": 0.7081911262798635,
"acc_stderr": 0.013284525292403511,
"acc_norm": 0.7295221843003413,
"acc_norm_stderr": 0.012980954547659556
},
"harness|hellaswag|10": {
"acc": 0.717486556462856,
"acc_stderr": 0.004493015945599716,
"acc_norm": 0.8895638319059949,
"acc_norm_stderr": 0.003127920738394107
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.674074074074074,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.674074074074074,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933714,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933714
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.02540255550326091,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.02540255550326091
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5,
"acc_stderr": 0.04472135954999579,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04472135954999579
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7935483870967742,
"acc_stderr": 0.023025899617188716,
"acc_norm": 0.7935483870967742,
"acc_norm_stderr": 0.023025899617188716
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.0274796030105388,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.0274796030105388
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131157,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131157
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.0302839955258844,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.0302839955258844
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.015776239256163248,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.015776239256163248
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.034086558679777494,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.034086558679777494
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926917,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926917
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601446,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601446
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092375,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092375
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993466,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993466
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258176,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.44581005586592176,
"acc_stderr": 0.016623998513333106,
"acc_norm": 0.44581005586592176,
"acc_norm_stderr": 0.016623998513333106
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.023993501709042103,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.023993501709042103
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47522816166883963,
"acc_stderr": 0.012754553719781753,
"acc_norm": 0.47522816166883963,
"acc_norm_stderr": 0.012754553719781753
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.02841820861940676,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.02841820861940676
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.019070985589687495,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.019070985589687495
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784596,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233268,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233268
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.572289156626506,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.572289156626506,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5740514075887393,
"mc1_stderr": 0.01731047190407654,
"mc2": 0.7169537206661909,
"mc2_stderr": 0.01479448042123873
},
"harness|winogrande|5": {
"acc": 0.856353591160221,
"acc_stderr": 0.009857280052696735
},
"harness|gsm8k|5": {
"acc": 0.7035633055344959,
"acc_stderr": 0.012579398235589526
}
}
```
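If you want the raw aggregate metrics rather than the per-sample details, one option is to fetch the results file linked above directly. This is a sketch assuming the standard `huggingface_hub` download API; the filename is the one shown in the "Latest results" link:

```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results file referenced in the "Latest results" link above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_Kquant03__Cognito-2x7B-bf16",
    filename="results_2024-02-04T09-26-22.410222.json",
    repo_type="dataset",
)
with open(path, encoding="utf-8") as f:
    results = json.load(f)

# The aggregate metrics may sit at the top level (as excerpted above) or under a
# "results" key, depending on the file layout; handle both cases.
metrics = results.get("results", results)
print(metrics["all"])
```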
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Kquant03__Cognito-2x7B-bf16 | [
"region:us"
] | 2024-02-04T09:28:39+00:00 | {"pretty_name": "Evaluation run of Kquant03/Cognito-2x7B-bf16", "dataset_summary": "Dataset automatically created during the evaluation run of model [Kquant03/Cognito-2x7B-bf16](https://huggingface.co/Kquant03/Cognito-2x7B-bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Kquant03__Cognito-2x7B-bf16\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-04T09:26:22.410222](https://huggingface.co/datasets/open-llm-leaderboard/details_Kquant03__Cognito-2x7B-bf16/blob/main/results_2024-02-04T09-26-22.410222.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6550063357391831,\n \"acc_stderr\": 0.0320338761154218,\n \"acc_norm\": 0.6540487782518486,\n \"acc_norm_stderr\": 0.03271120506441515,\n \"mc1\": 0.5740514075887393,\n \"mc1_stderr\": 0.01731047190407654,\n \"mc2\": 0.7169537206661909,\n \"mc2_stderr\": 0.01479448042123873\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7081911262798635,\n \"acc_stderr\": 0.013284525292403511,\n \"acc_norm\": 0.7295221843003413,\n \"acc_norm_stderr\": 0.012980954547659556\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.717486556462856,\n \"acc_stderr\": 0.004493015945599716,\n \"acc_norm\": 0.8895638319059949,\n \"acc_norm_stderr\": 0.003127920738394107\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.674074074074074,\n \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.674074074074074,\n \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933714,\n \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933714\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41798941798941797,\n \"acc_stderr\": 0.02540255550326091,\n \"acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.02540255550326091\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04472135954999579,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04472135954999579\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7935483870967742,\n \"acc_stderr\": 0.023025899617188716,\n \"acc_norm\": 0.7935483870967742,\n \"acc_norm_stderr\": 0.023025899617188716\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.0274796030105388,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.0274796030105388\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6615384615384615,\n \"acc_stderr\": 
0.023991500500313036,\n \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131157,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131157\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.0302839955258844,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.0302839955258844\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8385321100917431,\n \"acc_stderr\": 0.015776239256163248,\n \"acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.015776239256163248\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.034086558679777494,\n \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.034086558679777494\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601446,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601446\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.020588491316092375,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.020588491316092375\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n \"acc_stderr\": 0.013625556907993466,\n \"acc_norm\": 0.8237547892720306,\n 
\"acc_norm_stderr\": 0.013625556907993466\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258176,\n \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258176\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.44581005586592176,\n \"acc_stderr\": 0.016623998513333106,\n \"acc_norm\": 0.44581005586592176,\n \"acc_norm_stderr\": 0.016623998513333106\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.023993501709042103,\n \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.023993501709042103\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47522816166883963,\n \"acc_stderr\": 0.012754553719781753,\n \"acc_norm\": 0.47522816166883963,\n \"acc_norm_stderr\": 0.012754553719781753\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.019070985589687495,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.019070985589687495\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784596,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784596\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.026508590656233268,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.026508590656233268\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.572289156626506,\n \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5740514075887393,\n \"mc1_stderr\": 0.01731047190407654,\n \"mc2\": 0.7169537206661909,\n \"mc2_stderr\": 0.01479448042123873\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.856353591160221,\n \"acc_stderr\": 0.009857280052696735\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7035633055344959,\n \"acc_stderr\": 0.012579398235589526\n }\n}\n```", "repo_url": "https://huggingface.co/Kquant03/Cognito-2x7B-bf16", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|arc:challenge|25_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|gsm8k|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hellaswag|10_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T09-26-22.410222.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T09-26-22.410222.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T09-26-22.410222.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T09-26-22.410222.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T09-26-22.410222.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T09-26-22.410222.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["**/details_harness|winogrande|5_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-04T09-26-22.410222.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_04T09_26_22.410222", "path": ["results_2024-02-04T09-26-22.410222.parquet"]}, {"split": "latest", "path": 
["results_2024-02-04T09-26-22.410222.parquet"]}]}]} | 2024-02-04T09:29:05+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Kquant03/Cognito-2x7B-bf16
Dataset automatically created during the evaluation run of model Kquant03/Cognito-2x7B-bf16 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
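A minimal sketch (note: the repository name below is an assumption, inferred from the `details_<org>__<model>` naming convention these evaluation repos use elsewhere in this document; any of the configurations listed above can replace `"harness_winogrande_5"`):
```python
from datasets import load_dataset

# Repository name assumed from the usual "details_<org>__<model>" convention
# used by Open LLM Leaderboard evaluation repos.
data = load_dataset("open-llm-leaderboard/details_Kquant03__Cognito-2x7B-bf16",
	"harness_winogrande_5",
	split="train")
```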
## Latest results
These are the latest results from run 2024-02-04T09:26:22.410222 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and in the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Kquant03/Cognito-2x7B-bf16\n\n\n\nDataset automatically created during the evaluation run of model Kquant03/Cognito-2x7B-bf16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T09:26:22.410222(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Kquant03/Cognito-2x7B-bf16\n\n\n\nDataset automatically created during the evaluation run of model Kquant03/Cognito-2x7B-bf16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T09:26:22.410222(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
a06891b0e598f98d0343fab0cf058f54722008f0 | ## Description
46,695 Images - Household Waste Data. The data includes multiple types of waste and multiple scenes. This dataset can be used for tasks such as object detection, classification, and recognition.
For more details, please refer to the link: https://www.nexdata.ai/datasets/1030?source=Huggingface
## Data size
46,695 images, 46,695 bounding boxes
## Collecting environment
including indoor and outdoor scenes
## Data diversity
multiple types of waste, multiple scenes
## Device
cellphone
## Data format
the image data formats are .jpg, .png and .jpeg, the annotation file format is .json
## Annotation content
rectangular bounding boxes of waste
## Accuracy
the accuracy of rectangular bounding boxes is not less than 97%
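The exact JSON annotation schema is not documented in this card. As a rough illustration only, the sketch below assumes a hypothetical layout in which each annotation file lists rectangular boxes as [x, y, width, height] plus a label, and draws them on the matching image; the real field names may differ.
```python
import json
from PIL import Image, ImageDraw

# Hypothetical schema (not confirmed by the card):
# {"objects": [{"label": "plastic_bottle", "bbox": [x, y, w, h]}, ...]}
def draw_waste_boxes(image_path, annotation_path, output_path):
    image = Image.open(image_path).convert("RGB")
    draw = ImageDraw.Draw(image)
    with open(annotation_path, "r", encoding="utf-8") as f:
        annotation = json.load(f)
    for obj in annotation.get("objects", []):
        x, y, w, h = obj["bbox"]
        draw.rectangle([x, y, x + w, y + h], outline="red", width=3)
        draw.text((x, max(y - 12, 0)), obj.get("label", ""), fill="red")
    image.save(output_path)

# Example call (file names are placeholders):
# draw_waste_boxes("waste_001.jpg", "waste_001.json", "waste_001_vis.jpg")
```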
# Licensing Information
Commercial License
| Nexdata/Household_Waste_Data | [
"license:cc-by-nc-4.0",
"region:us"
] | 2024-02-04T09:32:12+00:00 | {"license": "cc-by-nc-4.0"} | 2024-02-04T10:06:48+00:00 | [] | [] | TAGS
#license-cc-by-nc-4.0 #region-us
| ## Description
46,695 Images - Household Waste Data. The data includes multiple types of waste and multiple scenes. This dataset can be used for tasks such as object detection, classification, and recognition.
For more details, please refer to the link: URL
## Data size
46,695 images, 46,695 bounding boxes
## Collecting environment
including indoor and outdoor scenes
## Data diversity
multiple types of waste, multiple scenes
## Device
cellphone
## Data format
the image data formats are .jpg, .png and .jpeg, the annotation file format is .json
## Annotation content
rectangular bounding boxes of waste
## Accuracy
the accuracy of rectangular bounding boxes is not less than 97%
# Licensing Information
Commercial License
| [
"## Description\n46,695 Images- Household Waste Data.The data includes multiple types of waste, multiple scenes. This data set can be used for tasks such as object detection, classification, and recognition.\n\nFor more details, please refer to the link: URL",
"## Date size\n46,695 images, 46,695 bounding boxes",
"## Collecting environment\nincluding indoor and outdoor scenes",
"## Data diversity\nmultiple types of waste, multiple scenes",
"## Device\ncellphone",
"## Data format\nthe image data formats are .jpg, png and jpeg , the annotation file format is .json",
"## Annotation content\nrectangular bounding boxes of waste",
"## Accuracy\nthe accuracy of rectangular bounding boxes is not less than 97%",
"# Licensing Information\nCommercial License"
] | [
"TAGS\n#license-cc-by-nc-4.0 #region-us \n",
"## Description\n46,695 Images- Household Waste Data.The data includes multiple types of waste, multiple scenes. This data set can be used for tasks such as object detection, classification, and recognition.\n\nFor more details, please refer to the link: URL",
"## Date size\n46,695 images, 46,695 bounding boxes",
"## Collecting environment\nincluding indoor and outdoor scenes",
"## Data diversity\nmultiple types of waste, multiple scenes",
"## Device\ncellphone",
"## Data format\nthe image data formats are .jpg, png and jpeg , the annotation file format is .json",
"## Annotation content\nrectangular bounding boxes of waste",
"## Accuracy\nthe accuracy of rectangular bounding boxes is not less than 97%",
"# Licensing Information\nCommercial License"
] |
d0d641f9b85bc54c8ca22b62ee5ea2e64f925fb0 |
# Dataset Card for Evaluation run of alchemonaut/QuartetAnemoi-70B-t0.0001
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [alchemonaut/QuartetAnemoi-70B-t0.0001](https://huggingface.co/alchemonaut/QuartetAnemoi-70B-t0.0001) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_alchemonaut__QuartetAnemoi-70B-t0.0001",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-04T09:33:24.428024](https://huggingface.co/datasets/open-llm-leaderboard/details_alchemonaut__QuartetAnemoi-70B-t0.0001/blob/main/results_2024-02-04T09-33-24.428024.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7529733519925265,
"acc_stderr": 0.028560689453846086,
"acc_norm": 0.7560953011110824,
"acc_norm_stderr": 0.029110491550081063,
"mc1": 0.5373317013463892,
"mc1_stderr": 0.017454645150970588,
"mc2": 0.6953224067002621,
"mc2_stderr": 0.014718923922056508
},
"harness|arc:challenge|25": {
"acc": 0.6919795221843004,
"acc_stderr": 0.013491429517292038,
"acc_norm": 0.7337883959044369,
"acc_norm_stderr": 0.012915774781523214
},
"harness|hellaswag|10": {
"acc": 0.7132045409281019,
"acc_stderr": 0.004513409114983832,
"acc_norm": 0.8889663413662617,
"acc_norm_stderr": 0.0031353173122281234
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6962962962962963,
"acc_stderr": 0.03972552884785137,
"acc_norm": 0.6962962962962963,
"acc_norm_stderr": 0.03972552884785137
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8355263157894737,
"acc_stderr": 0.03016753346863271,
"acc_norm": 0.8355263157894737,
"acc_norm_stderr": 0.03016753346863271
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036844,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036844
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8,
"acc_stderr": 0.024618298195866514,
"acc_norm": 0.8,
"acc_norm_stderr": 0.024618298195866514
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.875,
"acc_stderr": 0.02765610492929436,
"acc_norm": 0.875,
"acc_norm_stderr": 0.02765610492929436
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7514450867052023,
"acc_stderr": 0.03295304696818318,
"acc_norm": 0.7514450867052023,
"acc_norm_stderr": 0.03295304696818318
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.46078431372549017,
"acc_stderr": 0.04959859966384181,
"acc_norm": 0.46078431372549017,
"acc_norm_stderr": 0.04959859966384181
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7489361702127659,
"acc_stderr": 0.02834696377716245,
"acc_norm": 0.7489361702127659,
"acc_norm_stderr": 0.02834696377716245
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5877192982456141,
"acc_stderr": 0.04630653203366596,
"acc_norm": 0.5877192982456141,
"acc_norm_stderr": 0.04630653203366596
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7379310344827587,
"acc_stderr": 0.036646663372252565,
"acc_norm": 0.7379310344827587,
"acc_norm_stderr": 0.036646663372252565
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.025680564640056882,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.025680564640056882
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5634920634920635,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.5634920634920635,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8741935483870967,
"acc_stderr": 0.01886583428803001,
"acc_norm": 0.8741935483870967,
"acc_norm_stderr": 0.01886583428803001
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6551724137931034,
"acc_stderr": 0.033442837442804574,
"acc_norm": 0.6551724137931034,
"acc_norm_stderr": 0.033442837442804574
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8484848484848485,
"acc_stderr": 0.027998073798781675,
"acc_norm": 0.8484848484848485,
"acc_norm_stderr": 0.027998073798781675
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.898989898989899,
"acc_stderr": 0.021469735576055332,
"acc_norm": 0.898989898989899,
"acc_norm_stderr": 0.021469735576055332
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9430051813471503,
"acc_stderr": 0.016731085293607558,
"acc_norm": 0.9430051813471503,
"acc_norm_stderr": 0.016731085293607558
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7794871794871795,
"acc_stderr": 0.0210206726808279,
"acc_norm": 0.7794871794871795,
"acc_norm_stderr": 0.0210206726808279
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4148148148148148,
"acc_stderr": 0.030039842454069283,
"acc_norm": 0.4148148148148148,
"acc_norm_stderr": 0.030039842454069283
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8697478991596639,
"acc_stderr": 0.021863258494852128,
"acc_norm": 0.8697478991596639,
"acc_norm_stderr": 0.021863258494852128
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4900662251655629,
"acc_stderr": 0.04081677107248437,
"acc_norm": 0.4900662251655629,
"acc_norm_stderr": 0.04081677107248437
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9211009174311927,
"acc_stderr": 0.011558198113769553,
"acc_norm": 0.9211009174311927,
"acc_norm_stderr": 0.011558198113769553
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.030851992993257013,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.030851992993257013
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9264705882352942,
"acc_stderr": 0.018318855850089678,
"acc_norm": 0.9264705882352942,
"acc_norm_stderr": 0.018318855850089678
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9156118143459916,
"acc_stderr": 0.018094247116473335,
"acc_norm": 0.9156118143459916,
"acc_norm_stderr": 0.018094247116473335
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8116591928251121,
"acc_stderr": 0.02624113299640726,
"acc_norm": 0.8116591928251121,
"acc_norm_stderr": 0.02624113299640726
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8625954198473282,
"acc_stderr": 0.030194823996804475,
"acc_norm": 0.8625954198473282,
"acc_norm_stderr": 0.030194823996804475
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9256198347107438,
"acc_stderr": 0.02395268883667674,
"acc_norm": 0.9256198347107438,
"acc_norm_stderr": 0.02395268883667674
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8518518518518519,
"acc_stderr": 0.03434300243630999,
"acc_norm": 0.8518518518518519,
"acc_norm_stderr": 0.03434300243630999
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8159509202453987,
"acc_stderr": 0.030446777687971723,
"acc_norm": 0.8159509202453987,
"acc_norm_stderr": 0.030446777687971723
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6607142857142857,
"acc_stderr": 0.04493949068613539,
"acc_norm": 0.6607142857142857,
"acc_norm_stderr": 0.04493949068613539
},
"harness|hendrycksTest-management|5": {
"acc": 0.8737864077669902,
"acc_stderr": 0.03288180278808629,
"acc_norm": 0.8737864077669902,
"acc_norm_stderr": 0.03288180278808629
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9145299145299145,
"acc_stderr": 0.018315891685625845,
"acc_norm": 0.9145299145299145,
"acc_norm_stderr": 0.018315891685625845
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8978288633461047,
"acc_stderr": 0.01083072471313418,
"acc_norm": 0.8978288633461047,
"acc_norm_stderr": 0.01083072471313418
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8352601156069365,
"acc_stderr": 0.019971040982442272,
"acc_norm": 0.8352601156069365,
"acc_norm_stderr": 0.019971040982442272
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.646927374301676,
"acc_stderr": 0.015984204545268575,
"acc_norm": 0.646927374301676,
"acc_norm_stderr": 0.015984204545268575
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.826797385620915,
"acc_stderr": 0.021668400256514307,
"acc_norm": 0.826797385620915,
"acc_norm_stderr": 0.021668400256514307
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8263665594855305,
"acc_stderr": 0.021514051585970393,
"acc_norm": 0.8263665594855305,
"acc_norm_stderr": 0.021514051585970393
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8611111111111112,
"acc_stderr": 0.019242526226544533,
"acc_norm": 0.8611111111111112,
"acc_norm_stderr": 0.019242526226544533
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6099290780141844,
"acc_stderr": 0.02909767559946393,
"acc_norm": 0.6099290780141844,
"acc_norm_stderr": 0.02909767559946393
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5912646675358539,
"acc_stderr": 0.012555701346703387,
"acc_norm": 0.5912646675358539,
"acc_norm_stderr": 0.012555701346703387
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8125,
"acc_stderr": 0.023709788253811766,
"acc_norm": 0.8125,
"acc_norm_stderr": 0.023709788253811766
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.015076937921915376,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.015076937921915376
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8285714285714286,
"acc_stderr": 0.024127463462650156,
"acc_norm": 0.8285714285714286,
"acc_norm_stderr": 0.024127463462650156
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.9253731343283582,
"acc_stderr": 0.018581939698490618,
"acc_norm": 0.9253731343283582,
"acc_norm_stderr": 0.018581939698490618
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.95,
"acc_stderr": 0.021904291355759057,
"acc_norm": 0.95,
"acc_norm_stderr": 0.021904291355759057
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.038367221765980515,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.038367221765980515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.02517298435015577,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.02517298435015577
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5373317013463892,
"mc1_stderr": 0.017454645150970588,
"mc2": 0.6953224067002621,
"mc2_stderr": 0.014718923922056508
},
"harness|winogrande|5": {
"acc": 0.8531965272296764,
"acc_stderr": 0.009946627440250693
},
"harness|gsm8k|5": {
"acc": 0.686125852918878,
"acc_stderr": 0.012782681251053191
}
}
```
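As a small worked example of how the per-task entries above can be aggregated (a sketch only; it averages the `acc_norm` values of the `hendrycksTest-*` entries, with just three of them copied here for brevity):
```python
# Minimal sketch: mean acc_norm over the MMLU (hendrycksTest) tasks from the
# dictionary shown above. Only a few entries are inlined; in practice `results`
# would hold the full dictionary.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.4},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.6962962962962963},
    "harness|hendrycksTest-astronomy|5": {"acc_norm": 0.8355263157894737},
}

mmlu = {name: scores for name, scores in results.items()
        if name.startswith("harness|hendrycksTest-")}
mean_acc_norm = sum(s["acc_norm"] for s in mmlu.values()) / len(mmlu)
print(f"{len(mmlu)} MMLU tasks, mean acc_norm = {mean_acc_norm:.4f}")
```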
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_alchemonaut__QuartetAnemoi-70B-t0.0001 | [
"region:us"
] | 2024-02-04T09:35:49+00:00 | {"pretty_name": "Evaluation run of alchemonaut/QuartetAnemoi-70B-t0.0001", "dataset_summary": "Dataset automatically created during the evaluation run of model [alchemonaut/QuartetAnemoi-70B-t0.0001](https://huggingface.co/alchemonaut/QuartetAnemoi-70B-t0.0001) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_alchemonaut__QuartetAnemoi-70B-t0.0001\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-04T09:33:24.428024](https://huggingface.co/datasets/open-llm-leaderboard/details_alchemonaut__QuartetAnemoi-70B-t0.0001/blob/main/results_2024-02-04T09-33-24.428024.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7529733519925265,\n \"acc_stderr\": 0.028560689453846086,\n \"acc_norm\": 0.7560953011110824,\n \"acc_norm_stderr\": 0.029110491550081063,\n \"mc1\": 0.5373317013463892,\n \"mc1_stderr\": 0.017454645150970588,\n \"mc2\": 0.6953224067002621,\n \"mc2_stderr\": 0.014718923922056508\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6919795221843004,\n \"acc_stderr\": 0.013491429517292038,\n \"acc_norm\": 0.7337883959044369,\n \"acc_norm_stderr\": 0.012915774781523214\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7132045409281019,\n \"acc_stderr\": 0.004513409114983832,\n \"acc_norm\": 0.8889663413662617,\n \"acc_norm_stderr\": 0.0031353173122281234\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6962962962962963,\n \"acc_stderr\": 0.03972552884785137,\n \"acc_norm\": 0.6962962962962963,\n \"acc_norm_stderr\": 0.03972552884785137\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8355263157894737,\n \"acc_stderr\": 0.03016753346863271,\n \"acc_norm\": 0.8355263157894737,\n \"acc_norm_stderr\": 0.03016753346863271\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036844,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036844\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.024618298195866514,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.024618298195866514\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.875,\n \"acc_stderr\": 0.02765610492929436,\n \"acc_norm\": 0.875,\n \"acc_norm_stderr\": 0.02765610492929436\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7514450867052023,\n \"acc_stderr\": 0.03295304696818318,\n \"acc_norm\": 0.7514450867052023,\n \"acc_norm_stderr\": 0.03295304696818318\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.46078431372549017,\n \"acc_stderr\": 0.04959859966384181,\n \"acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.04959859966384181\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7489361702127659,\n \"acc_stderr\": 0.02834696377716245,\n \"acc_norm\": 0.7489361702127659,\n \"acc_norm_stderr\": 0.02834696377716245\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5877192982456141,\n \"acc_stderr\": 0.04630653203366596,\n \"acc_norm\": 0.5877192982456141,\n \"acc_norm_stderr\": 0.04630653203366596\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7379310344827587,\n \"acc_stderr\": 0.036646663372252565,\n \"acc_norm\": 0.7379310344827587,\n \"acc_norm_stderr\": 0.036646663372252565\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.5370370370370371,\n \"acc_stderr\": 0.025680564640056882,\n \"acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.025680564640056882\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5634920634920635,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.5634920634920635,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8741935483870967,\n \"acc_stderr\": 0.01886583428803001,\n \"acc_norm\": 0.8741935483870967,\n \"acc_norm_stderr\": 0.01886583428803001\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6551724137931034,\n \"acc_stderr\": 0.033442837442804574,\n \"acc_norm\": 0.6551724137931034,\n \"acc_norm_stderr\": 0.033442837442804574\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8484848484848485,\n \"acc_stderr\": 0.027998073798781675,\n \"acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.027998073798781675\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.898989898989899,\n \"acc_stderr\": 0.021469735576055332,\n \"acc_norm\": 0.898989898989899,\n \"acc_norm_stderr\": 0.021469735576055332\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9430051813471503,\n \"acc_stderr\": 0.016731085293607558,\n \"acc_norm\": 0.9430051813471503,\n \"acc_norm_stderr\": 0.016731085293607558\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": 
{\n \"acc\": 0.7794871794871795,\n \"acc_stderr\": 0.0210206726808279,\n \"acc_norm\": 0.7794871794871795,\n \"acc_norm_stderr\": 0.0210206726808279\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.4148148148148148,\n \"acc_stderr\": 0.030039842454069283,\n \"acc_norm\": 0.4148148148148148,\n \"acc_norm_stderr\": 0.030039842454069283\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8697478991596639,\n \"acc_stderr\": 0.021863258494852128,\n \"acc_norm\": 0.8697478991596639,\n \"acc_norm_stderr\": 0.021863258494852128\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.4900662251655629,\n \"acc_stderr\": 0.04081677107248437,\n \"acc_norm\": 0.4900662251655629,\n \"acc_norm_stderr\": 0.04081677107248437\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9211009174311927,\n \"acc_stderr\": 0.011558198113769553,\n \"acc_norm\": 0.9211009174311927,\n \"acc_norm_stderr\": 0.011558198113769553\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.030851992993257013,\n \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.030851992993257013\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9264705882352942,\n \"acc_stderr\": 0.018318855850089678,\n \"acc_norm\": 0.9264705882352942,\n \"acc_norm_stderr\": 0.018318855850089678\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9156118143459916,\n \"acc_stderr\": 0.018094247116473335,\n \"acc_norm\": 0.9156118143459916,\n \"acc_norm_stderr\": 0.018094247116473335\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8116591928251121,\n \"acc_stderr\": 0.02624113299640726,\n \"acc_norm\": 0.8116591928251121,\n \"acc_norm_stderr\": 0.02624113299640726\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8625954198473282,\n \"acc_stderr\": 0.030194823996804475,\n \"acc_norm\": 0.8625954198473282,\n \"acc_norm_stderr\": 0.030194823996804475\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.9256198347107438,\n \"acc_stderr\": 0.02395268883667674,\n \"acc_norm\": 0.9256198347107438,\n \"acc_norm_stderr\": 0.02395268883667674\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8518518518518519,\n \"acc_stderr\": 0.03434300243630999,\n \"acc_norm\": 0.8518518518518519,\n \"acc_norm_stderr\": 0.03434300243630999\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8159509202453987,\n \"acc_stderr\": 0.030446777687971723,\n \"acc_norm\": 0.8159509202453987,\n \"acc_norm_stderr\": 0.030446777687971723\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6607142857142857,\n \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.6607142857142857,\n \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8737864077669902,\n \"acc_stderr\": 0.03288180278808629,\n \"acc_norm\": 0.8737864077669902,\n \"acc_norm_stderr\": 0.03288180278808629\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9145299145299145,\n \"acc_stderr\": 0.018315891685625845,\n \"acc_norm\": 0.9145299145299145,\n \"acc_norm_stderr\": 0.018315891685625845\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8978288633461047,\n \"acc_stderr\": 0.01083072471313418,\n 
\"acc_norm\": 0.8978288633461047,\n \"acc_norm_stderr\": 0.01083072471313418\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8352601156069365,\n \"acc_stderr\": 0.019971040982442272,\n \"acc_norm\": 0.8352601156069365,\n \"acc_norm_stderr\": 0.019971040982442272\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.646927374301676,\n \"acc_stderr\": 0.015984204545268575,\n \"acc_norm\": 0.646927374301676,\n \"acc_norm_stderr\": 0.015984204545268575\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.826797385620915,\n \"acc_stderr\": 0.021668400256514307,\n \"acc_norm\": 0.826797385620915,\n \"acc_norm_stderr\": 0.021668400256514307\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8263665594855305,\n \"acc_stderr\": 0.021514051585970393,\n \"acc_norm\": 0.8263665594855305,\n \"acc_norm_stderr\": 0.021514051585970393\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8611111111111112,\n \"acc_stderr\": 0.019242526226544533,\n \"acc_norm\": 0.8611111111111112,\n \"acc_norm_stderr\": 0.019242526226544533\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6099290780141844,\n \"acc_stderr\": 0.02909767559946393,\n \"acc_norm\": 0.6099290780141844,\n \"acc_norm_stderr\": 0.02909767559946393\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5912646675358539,\n \"acc_stderr\": 0.012555701346703387,\n \"acc_norm\": 0.5912646675358539,\n \"acc_norm_stderr\": 0.012555701346703387\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8125,\n \"acc_stderr\": 0.023709788253811766,\n \"acc_norm\": 0.8125,\n \"acc_norm_stderr\": 0.023709788253811766\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.015076937921915376,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.015076937921915376\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8285714285714286,\n \"acc_stderr\": 0.024127463462650156,\n \"acc_norm\": 0.8285714285714286,\n \"acc_norm_stderr\": 0.024127463462650156\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.9253731343283582,\n \"acc_stderr\": 0.018581939698490618,\n \"acc_norm\": 0.9253731343283582,\n \"acc_norm_stderr\": 0.018581939698490618\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.95,\n \"acc_stderr\": 0.021904291355759057,\n \"acc_norm\": 0.95,\n \"acc_norm_stderr\": 0.021904291355759057\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n \"acc_stderr\": 0.038367221765980515,\n \"acc_norm\": 0.5843373493975904,\n \"acc_norm_stderr\": 0.038367221765980515\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015577,\n \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015577\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5373317013463892,\n \"mc1_stderr\": 0.017454645150970588,\n \"mc2\": 0.6953224067002621,\n \"mc2_stderr\": 0.014718923922056508\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8531965272296764,\n \"acc_stderr\": 0.009946627440250693\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.686125852918878,\n \"acc_stderr\": 0.012782681251053191\n }\n}\n```", "repo_url": 
"https://huggingface.co/alchemonaut/QuartetAnemoi-70B-t0.0001", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|arc:challenge|25_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|gsm8k|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|hellaswag|10_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T09-33-24.428024.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T09-33-24.428024.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T09-33-24.428024.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T09-33-24.428024.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T09-33-24.428024.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_04T09_33_24.428024", "path": ["**/details_harness|winogrande|5_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-04T09-33-24.428024.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_04T09_33_24.428024", "path": ["results_2024-02-04T09-33-24.428024.parquet"]}, {"split": "latest", "path": ["results_2024-02-04T09-33-24.428024.parquet"]}]}]} | 2024-02-04T09:36:10+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of alchemonaut/QuartetAnemoi-70B-t0.0001
Dataset automatically created during the evaluation run of model alchemonaut/QuartetAnemoi-70B-t0.0001 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
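The snippet below follows the loading pattern used by these evaluation-detail datasets; the repository id is inferred from the model name and should be treated as an assumption, while "harness_winogrande_5" is one of the configurations listed in this card's metadata.

```python
from datasets import load_dataset

# Repository id assumed from the model name, following the usual
# open-llm-leaderboard naming pattern for evaluation-detail datasets.
data = load_dataset("open-llm-leaderboard/details_alchemonaut__QuartetAnemoi-70B-t0.0001",
	"harness_winogrande_5",
	split="train")
```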
## Latest results
These are the latest results from run 2024-02-04T09:33:24.428024 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of alchemonaut/QuartetAnemoi-70B-t0.0001\n\n\n\nDataset automatically created during the evaluation run of model alchemonaut/QuartetAnemoi-70B-t0.0001 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T09:33:24.428024(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of alchemonaut/QuartetAnemoi-70B-t0.0001\n\n\n\nDataset automatically created during the evaluation run of model alchemonaut/QuartetAnemoi-70B-t0.0001 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T09:33:24.428024(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
48ec88951f9292a3aae5c4d870148f07d601fd35 | ## Description
180,718 Images - Sign Language Gestures Recognition Data. The data diversity includes multiple scenes, 41 static gestures, 95 dynamic gestures, multiple photographic angles, and multiple light conditions. In terms of data annotation, 21 landmarks, gesture types, and gesture attributes were annotated. This dataset can be used for tasks such as gesture recognition and sign language translation.
For more details, please refer to the link: https://www.nexdata.ai/datasets/980?source=Huggingface
## Data size
180,718 images, including 83,013 images of static gestures, 97,705 images of dynamic gestures
## Population distribution
the race distribution is Asian, the gender distribution is male and female, the age distribution is mainly young people and middle-aged people
## Collection environment
including indoor scenes and outdoor scenes
## Collection diversity
including multiple scenes, 41 static gestures, 95 dynamic gestures, multiple photographic angles, multiple light conditions
## Device
cellphone
## Data format
the image data format is .jpg, the annotation file format is .json
## Collecting content
sign language gestures were collected in different scenes
## Annotation content
21 landmarks annotation (each landmark includes the attribute of visible or invisible), gesture type annotation, gesture attributes annotation (left hand or right hand)
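As an illustrative sketch only (the actual JSON schema is not documented on this card, so every field name below, such as `landmarks`, `visible`, `gesture_type`, and `hand`, is hypothetical), reading one annotation file might look like this:

```python
import json

# Hypothetical schema; adjust the field names to the files shipped with the dataset.
with open("sample_annotation.json", "r", encoding="utf-8") as f:
    ann = json.load(f)

# 21 landmarks, each with a position and a visible/invisible attribute
for i, lm in enumerate(ann["landmarks"]):
    print(i, lm["x"], lm["y"], "visible" if lm["visible"] else "invisible")

print("gesture type:", ann["gesture_type"])  # one of the 41 static or 95 dynamic gestures
print("hand:", ann["hand"])                  # "left" or "right"
```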
## Accuracy
accuracy requirement: the point location errors in x and y directions are less than 3 pixels, which is considered a qualified annotation; accuracy of landmark annotation: taking each annotated landmark as the unit, the accuracy rate shall be more than 95%
# Licensing Information
Commercial License | Nexdata/Sign_Language_Gestures_Recognition_Data | [
"license:cc-by-nc-4.0",
"region:us"
] | 2024-02-04T09:36:38+00:00 | {"license": "cc-by-nc-4.0"} | 2024-02-04T10:09:17+00:00 | [] | [] | TAGS
#license-cc-by-nc-4.0 #region-us
| ## Description
180,718 Images - Sign Language Gestures Recognition Data. The data diversity includes multiple scenes, 41 static gestures, 95 dynamic gestures, multiple photographic angles, and multiple light conditions. In terms of data annotation, 21 landmarks, gesture types, and gesture attributes were annotated. This dataset can be used for tasks such as gesture recognition and sign language translation.
For more details, please refer to the link: URL
## Data size
180,718 images, including 83,013 images of static gestures, 97,705 images of dynamic gestures
## Population distribution
the race distribution is Asian, the gender distribution is male and female, the age distribution is mainly young people and middle-aged people
## Collection environment
including indoor scenes and outdoor scenes
## Collection diversity
including multiple scenes, 41 static gestures, 95 dynamic gestures, multiple photographic angles, multiple light conditions
## Device
cellphone
## Data forma
the image data format is .jpg, the annotation file format is .json
## Collecting content
sign language gestures were collected in different scenes
## Annotation content
21 landmarks annotation (each landmark includes the attribute of visible or invisible), gesture type annotation, gesture attributes annotation (left hand or right hand)
## Accuracy
accuracy requirement: the point location errors in x and y directions are less than 3 pixels, which is considered as a qualified annotation; accuracy of landmark annotation: the annotation part (each landmark) is regarded as the unit, the accuracy rate shall be more than 95%
# Licensing Information
Commercial License | [
"## Description\n180,718 Images - Sign Language Gestures Recognition Data. The data diversity includes multiple scenes, 41 static gestures, 95 dynamic gestures, multiple photographic angles, and multiple light conditions. In terms of data annotation, 21 landmarks, gesture types, and gesture attributes were annotated. This dataset can be used for tasks such as gesture recognition and sign language translation.\n\nFor more details, please refer to the link: URL",
"## Data size\n180,718 images, including 83,013 images of static gestures, 97,705 images of dynamic gestures",
"## Population distribution\nthe race distribution is Asian, the gender distribution is male and female, the age distribution is mainly young people and middle-aged people",
"## Collection environment\nincluding indoor scenes and outdoor scenes",
"## Collection diversity\nincluding multiple scenes, 41 static gestures, 95 dynamic gestures, multiple photographic angles, multiple light conditions",
"## Device\ncellphone",
"## Data forma\nthe image data format is .jpg, the annotation file format is .json",
"## Collecting content\nsign language gestures were collected in different scenes",
"## Annotation content\n21 landmarks annotation (each landmark includes the attribute of visible or invisible), gesture type annotation, gesture attributes annotation (left hand or right hand)",
"## Accuracy\naccuracy requirement: the point location errors in x and y directions are less than 3 pixels, which is considered as a qualified annotation; accuracy of landmark annotation: the annotation part (each landmark) is regarded as the unit, the accuracy rate shall be more than 95%",
"# Licensing Information\nCommercial License"
] | [
"TAGS\n#license-cc-by-nc-4.0 #region-us \n",
"## Description\n180,718 Images - Sign Language Gestures Recognition Data. The data diversity includes multiple scenes, 41 static gestures, 95 dynamic gestures, multiple photographic angles, and multiple light conditions. In terms of data annotation, 21 landmarks, gesture types, and gesture attributes were annotated. This dataset can be used for tasks such as gesture recognition and sign language translation.\n\nFor more details, please refer to the link: URL",
"## Data size\n180,718 images, including 83,013 images of static gestures, 97,705 images of dynamic gestures",
"## Population distribution\nthe race distribution is Asian, the gender distribution is male and female, the age distribution is mainly young people and middle-aged people",
"## Collection environment\nincluding indoor scenes and outdoor scenes",
"## Collection diversity\nincluding multiple scenes, 41 static gestures, 95 dynamic gestures, multiple photographic angles, multiple light conditions",
"## Device\ncellphone",
"## Data forma\nthe image data format is .jpg, the annotation file format is .json",
"## Collecting content\nsign language gestures were collected in different scenes",
"## Annotation content\n21 landmarks annotation (each landmark includes the attribute of visible or invisible), gesture type annotation, gesture attributes annotation (left hand or right hand)",
"## Accuracy\naccuracy requirement: the point location errors in x and y directions are less than 3 pixels, which is considered as a qualified annotation; accuracy of landmark annotation: the annotation part (each landmark) is regarded as the unit, the accuracy rate shall be more than 95%",
"# Licensing Information\nCommercial License"
] |
355dd1de41a39f09deb1002d01982a543508aa8f | # Dataset for "Discharge Me!"
Welcome to the dataset for "Discharge Me!", the BioNLP ACL'24 Shared Task on Streamlining Discharge Documentation.
For information about the shared task, please visit the [Codabench Challenge Website](https://www.codabench.org/competitions/1975/) or the [GitHub Page](https://stanford-aimi.github.io/discharge-me/).
The dataset for this shared task is based on the MIMIC-IV dataset. Before accessing and utilizing the provided data split, all participants are required to have signed the data access agreements on PhysioNet.
**By requesting access and using the provided Huggingface dataset split, the participants agree to the following:**
- Participants will comply with the *PhysioNet Credentialed Health Data License 1.5.0* pertaining to the use of the MIMIC-IV dataset and its modules (MIMIC-IV-Note and MIMIC-IV-ED).
- Participants will use the granted access solely for the purpose of participating in "Discharge Me!" at BioNLP ACL'24. Unauthorized use or distribution of the data is prohibited.
- Participants will ensure the confidentiality of the data and will not disclose any protected health information (PHI) if found.
- Participants will acknowledge the use of the MIMIC-IV dataset in any publications, presentations, or reports resulting from their work in "Discharge Me!".
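Once credentialed access has been granted, a split can be loaded with the `datasets` library. The sketch below is illustrative only: the configuration name is taken from this repository's metadata, and it assumes you are logged in to the Hugging Face Hub with an approved account.

```python
from datasets import load_dataset

# Illustrative sketch: assumes `huggingface-cli login` has been run with an
# account that has signed the PhysioNet agreements and been granted access.
# The config name is one of those listed in this repository's metadata.
ds = load_dataset(
    "justin13601/discharge-me",
    "1. Discharge Summary Sections (Targets)",
    split="train",
)
print(ds.column_names)
```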
We hope you enjoy the shared task!
-- Organizers of "Discharge Me!" @ BioNLP ACL'24 | justin13601/discharge-me | [
"task_categories:text-generation",
"size_categories:100K<n<1M",
"language:en",
"license:other",
"medical",
"region:us"
] | 2024-02-04T09:41:09+00:00 | {"language": ["en"], "license": "other", "size_categories": ["100K<n<1M"], "task_categories": ["text-generation"], "pretty_name": "Dataset for \"Discharge Me!\": Shared Task on Streamlining Discharge Documentation", "license_name": "physionet-credentialed-health-data-license-1.5.0", "license_link": "https://physionet.org/content/mimiciv/view-license/2.2/", "tags": ["medical"], "configs": [{"config_name": "1. Discharge Summary Sections (Targets)", "data_files": [{"split": "train", "path": "public/train/discharge_target.csv.gz"}, {"split": "valid", "path": "public/valid/discharge_target.csv.gz"}, {"split": "test", "path": "public/test/discharge_target.csv.gz"}]}, {"config_name": "2. Discharge Summaries", "data_files": [{"split": "train", "path": "public/train/discharge.csv.gz"}, {"split": "valid", "path": "public/valid/discharge.csv.gz"}, {"split": "test", "path": "public/test/discharge.csv.gz"}]}, {"config_name": "3. Radiology Reports", "data_files": [{"split": "train", "path": "public/train/radiology.csv.gz"}, {"split": "valid", "path": "public/valid/radiology.csv.gz"}, {"split": "test", "path": "public/test/radiology.csv.gz"}]}, {"config_name": "4. ED Stays", "data_files": [{"split": "train", "path": "public/train/edstays.csv.gz"}, {"split": "valid", "path": "public/valid/edstays.csv.gz"}, {"split": "test", "path": "public/test/edstays.csv.gz"}]}, {"config_name": "5. ED Triages", "data_files": [{"split": "train", "path": "public/train/triage.csv.gz"}, {"split": "valid", "path": "public/valid/triage.csv.gz"}, {"split": "test", "path": "public/test/triage.csv.gz"}]}, {"config_name": "6. ED Diagnoses", "data_files": [{"split": "train", "path": "public/train/diagnosis.csv.gz"}, {"split": "valid", "path": "public/valid/diagnosis.csv.gz"}, {"split": "test", "path": "public/test/diagnosis.csv.gz"}]}]} | 2024-02-06T07:34:20+00:00 | [] | [
"en"
] | TAGS
#task_categories-text-generation #size_categories-100K<n<1M #language-English #license-other #medical #region-us
| # Dataset for "Discharge Me!"
Welcome to the dataset for "Discharge Me!", the BioNLP ACL'24 Shared Task on Streamlining Discharge Documentation.
For information about the shared task, please visit the Codabench Challenge Website or the GitHub Page.
The dataset for this shared task is based off the MIMIC-IV dataset. Before accessing and utilizing the provided data split, all participants are required to have signed the data access agreements on PhysioNet.
By requesting access and using the provided Huggingface dataset split, the participants agree to the following:
- Participants will comply with the *PhysioNet Credentialed Health Data License 1.5.0* pertaining to the use of the MIMIC-IV dataset and its modules (MIMIC-IV-Note and MIMIC-IV-ED).
- Participants will use the granted access solely for the purpose of participating in "Discharge Me!" at BioNLP ACL'24. Unauthorized use or distribution of the data is prohibited.
- Participants will ensure the confidentiality of the data and will not disclose any protected health information (PHI) if found.
- Participants will acknowledge the use of the MIMIC-IV dataset in any publications, presentations, or reports resulting from their work in "Discharge Me!".
We hope you enjoy the shared task!
-- Organizers of "Discharge Me!" @ BioNLP ACL'24 | [
"# Dataset for \"Discharge Me!\"\n\nWelcome to the dataset for \"Discharge Me!\", the BioNLP ACL'24 Shared Task on Streamlining Discharge Documentation.\n\nFor information about the shared task, please visit the Codabench Challenge Website or the GitHub Page. \n\nThe dataset for this shared task is based off the MIMIC-IV dataset. Before accessing and utilizing the provided data split, all participants are required to have signed the data access agreements on PhysioNet.\n\nBy requesting access and using the provided Huggingface dataset split, the participants agree to the following:\n- Participants will comply with the *PhysioNet Credentialed Health Data License 1.5.0* pertaining to the use of the MIMIC-IV dataset and its modules (MIMIC-IV-Note and MIMIC-IV-ED).\n- Participants will use the granted access solely for the purpose of participating in \"Discharge Me!\" at BioNLP ACL'24. Unauthorized use or distribution of the data is prohibited.\n- Participants will ensure the confidentiality of the data and will not disclose any protected health information (PHI) if found.\n- Participants will acknowledge the use of the MIMIC-IV dataset in any publications, presentations, or reports resulting from their work in \"Discharge Me!\".\n\nWe hope you enjoy the shared task!\n\n-- Organizers of \"Discharge Me!\" @ BioNLP ACL'24"
] | [
"TAGS\n#task_categories-text-generation #size_categories-100K<n<1M #language-English #license-other #medical #region-us \n",
"# Dataset for \"Discharge Me!\"\n\nWelcome to the dataset for \"Discharge Me!\", the BioNLP ACL'24 Shared Task on Streamlining Discharge Documentation.\n\nFor information about the shared task, please visit the Codabench Challenge Website or the GitHub Page. \n\nThe dataset for this shared task is based off the MIMIC-IV dataset. Before accessing and utilizing the provided data split, all participants are required to have signed the data access agreements on PhysioNet.\n\nBy requesting access and using the provided Huggingface dataset split, the participants agree to the following:\n- Participants will comply with the *PhysioNet Credentialed Health Data License 1.5.0* pertaining to the use of the MIMIC-IV dataset and its modules (MIMIC-IV-Note and MIMIC-IV-ED).\n- Participants will use the granted access solely for the purpose of participating in \"Discharge Me!\" at BioNLP ACL'24. Unauthorized use or distribution of the data is prohibited.\n- Participants will ensure the confidentiality of the data and will not disclose any protected health information (PHI) if found.\n- Participants will acknowledge the use of the MIMIC-IV dataset in any publications, presentations, or reports resulting from their work in \"Discharge Me!\".\n\nWe hope you enjoy the shared task!\n\n-- Organizers of \"Discharge Me!\" @ BioNLP ACL'24"
] |
faa4713d236ad4c6344e6f4c46c0a4630ed76ef1 |
# Dataset Card for Evaluation run of kevin009/flyingllama-v2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [kevin009/flyingllama-v2](https://huggingface.co/kevin009/flyingllama-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kevin009__flyingllama-v2",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-04T09:40:33.484186](https://huggingface.co/datasets/open-llm-leaderboard/details_kevin009__flyingllama-v2/blob/main/results_2024-02-04T09-40-33.484186.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.26355828648354096,
"acc_stderr": 0.030989295716946252,
"acc_norm": 0.26547327723680664,
"acc_norm_stderr": 0.031814473208964,
"mc1": 0.24357405140758873,
"mc1_stderr": 0.015026354824910782,
"mc2": 0.41299297017962017,
"mc2_stderr": 0.014938905945440792
},
"harness|arc:challenge|25": {
"acc": 0.2158703071672355,
"acc_stderr": 0.012022975360030672,
"acc_norm": 0.24744027303754265,
"acc_norm_stderr": 0.01261035266329267
},
"harness|hellaswag|10": {
"acc": 0.32732523401712804,
"acc_stderr": 0.004682780790508346,
"acc_norm": 0.3843855805616411,
"acc_norm_stderr": 0.004854555294017559
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.03820169914517904,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.03820169914517904
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.19736842105263158,
"acc_stderr": 0.03238981601699397,
"acc_norm": 0.19736842105263158,
"acc_norm_stderr": 0.03238981601699397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.24150943396226415,
"acc_stderr": 0.02634148037111836,
"acc_norm": 0.24150943396226415,
"acc_norm_stderr": 0.02634148037111836
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.04605661864718381,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04605661864718381
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.03242414757483099,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.03242414757483099
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006717,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006717
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.19574468085106383,
"acc_stderr": 0.025937853139977148,
"acc_norm": 0.19574468085106383,
"acc_norm_stderr": 0.025937853139977148
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813344,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813344
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2482758620689655,
"acc_stderr": 0.036001056927277716,
"acc_norm": 0.2482758620689655,
"acc_norm_stderr": 0.036001056927277716
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24867724867724866,
"acc_stderr": 0.022261817692400175,
"acc_norm": 0.24867724867724866,
"acc_norm_stderr": 0.022261817692400175
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.03809523809523811,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.03809523809523811
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2903225806451613,
"acc_stderr": 0.025822106119415888,
"acc_norm": 0.2903225806451613,
"acc_norm_stderr": 0.025822106119415888
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.27586206896551724,
"acc_stderr": 0.03144712581678241,
"acc_norm": 0.27586206896551724,
"acc_norm_stderr": 0.03144712581678241
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23030303030303031,
"acc_stderr": 0.03287666758603488,
"acc_norm": 0.23030303030303031,
"acc_norm_stderr": 0.03287666758603488
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3484848484848485,
"acc_stderr": 0.033948539651564025,
"acc_norm": 0.3484848484848485,
"acc_norm_stderr": 0.033948539651564025
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.33678756476683935,
"acc_stderr": 0.034107802518361825,
"acc_norm": 0.33678756476683935,
"acc_norm_stderr": 0.034107802518361825
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.33589743589743587,
"acc_stderr": 0.023946724741563976,
"acc_norm": 0.33589743589743587,
"acc_norm_stderr": 0.023946724741563976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.027080372815145668,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.027080372815145668
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2184873949579832,
"acc_stderr": 0.02684151432295895,
"acc_norm": 0.2184873949579832,
"acc_norm_stderr": 0.02684151432295895
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.29724770642201837,
"acc_stderr": 0.019595707224643544,
"acc_norm": 0.29724770642201837,
"acc_norm_stderr": 0.019595707224643544
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.23039215686274508,
"acc_stderr": 0.029554292605695053,
"acc_norm": 0.23039215686274508,
"acc_norm_stderr": 0.029554292605695053
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2489451476793249,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.2489451476793249,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.16591928251121077,
"acc_stderr": 0.024967553196547136,
"acc_norm": 0.16591928251121077,
"acc_norm_stderr": 0.024967553196547136
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.32231404958677684,
"acc_stderr": 0.042664163633521664,
"acc_norm": 0.32231404958677684,
"acc_norm_stderr": 0.042664163633521664
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.04284467968052192,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.04284467968052192
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.26380368098159507,
"acc_stderr": 0.03462419931615624,
"acc_norm": 0.26380368098159507,
"acc_norm_stderr": 0.03462419931615624
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.1875,
"acc_stderr": 0.0370468111477387,
"acc_norm": 0.1875,
"acc_norm_stderr": 0.0370468111477387
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.03760178006026621,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.03760178006026621
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.027236013946196663,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.027236013946196663
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2656449553001277,
"acc_stderr": 0.015794302487888726,
"acc_norm": 0.2656449553001277,
"acc_norm_stderr": 0.015794302487888726
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.23121387283236994,
"acc_stderr": 0.022698657167855716,
"acc_norm": 0.23121387283236994,
"acc_norm_stderr": 0.022698657167855716
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.02564686309713791,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.02564686309713791
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.19614147909967847,
"acc_stderr": 0.022552447780478026,
"acc_norm": 0.19614147909967847,
"acc_norm_stderr": 0.022552447780478026
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.023132376234543346,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.023132376234543346
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24822695035460993,
"acc_stderr": 0.02577001564429039,
"acc_norm": 0.24822695035460993,
"acc_norm_stderr": 0.02577001564429039
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.27249022164276404,
"acc_stderr": 0.011371658294311525,
"acc_norm": 0.27249022164276404,
"acc_norm_stderr": 0.011371658294311525
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4485294117647059,
"acc_stderr": 0.030211479609121593,
"acc_norm": 0.4485294117647059,
"acc_norm_stderr": 0.030211479609121593
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.23039215686274508,
"acc_stderr": 0.017035229258034044,
"acc_norm": 0.23039215686274508,
"acc_norm_stderr": 0.017035229258034044
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2545454545454545,
"acc_stderr": 0.041723430387053825,
"acc_norm": 0.2545454545454545,
"acc_norm_stderr": 0.041723430387053825
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.32653061224489793,
"acc_stderr": 0.030021056238440317,
"acc_norm": 0.32653061224489793,
"acc_norm_stderr": 0.030021056238440317
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.27860696517412936,
"acc_stderr": 0.031700561834973086,
"acc_norm": 0.27860696517412936,
"acc_norm_stderr": 0.031700561834973086
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3192771084337349,
"acc_stderr": 0.0362933532994786,
"acc_norm": 0.3192771084337349,
"acc_norm_stderr": 0.0362933532994786
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.034462962170884265,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.034462962170884265
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24357405140758873,
"mc1_stderr": 0.015026354824910782,
"mc2": 0.41299297017962017,
"mc2_stderr": 0.014938905945440792
},
"harness|winogrande|5": {
"acc": 0.5027624309392266,
"acc_stderr": 0.014052271211616438
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_kevin009__flyingllama-v2 | [
"region:us"
] | 2024-02-04T09:41:53+00:00 | {"pretty_name": "Evaluation run of kevin009/flyingllama-v2", "dataset_summary": "Dataset automatically created during the evaluation run of model [kevin009/flyingllama-v2](https://huggingface.co/kevin009/flyingllama-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kevin009__flyingllama-v2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-04T09:40:33.484186](https://huggingface.co/datasets/open-llm-leaderboard/details_kevin009__flyingllama-v2/blob/main/results_2024-02-04T09-40-33.484186.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26355828648354096,\n \"acc_stderr\": 0.030989295716946252,\n \"acc_norm\": 0.26547327723680664,\n \"acc_norm_stderr\": 0.031814473208964,\n \"mc1\": 0.24357405140758873,\n \"mc1_stderr\": 0.015026354824910782,\n \"mc2\": 0.41299297017962017,\n \"mc2_stderr\": 0.014938905945440792\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.2158703071672355,\n \"acc_stderr\": 0.012022975360030672,\n \"acc_norm\": 0.24744027303754265,\n \"acc_norm_stderr\": 0.01261035266329267\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.32732523401712804,\n \"acc_stderr\": 0.004682780790508346,\n \"acc_norm\": 0.3843855805616411,\n \"acc_norm_stderr\": 0.004854555294017559\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.03820169914517904,\n \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.03820169914517904\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.19736842105263158,\n \"acc_stderr\": 0.03238981601699397,\n \"acc_norm\": 0.19736842105263158,\n \"acc_norm_stderr\": 0.03238981601699397\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.24150943396226415,\n \"acc_stderr\": 0.02634148037111836,\n \"acc_norm\": 0.24150943396226415,\n \"acc_norm_stderr\": 0.02634148037111836\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n \"acc_stderr\": 0.03242414757483099,\n \"acc_norm\": 0.23699421965317918,\n \"acc_norm_stderr\": 0.03242414757483099\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006717,\n \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006717\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.0416333199893227\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.19574468085106383,\n \"acc_stderr\": 0.025937853139977148,\n \"acc_norm\": 0.19574468085106383,\n \"acc_norm_stderr\": 0.025937853139977148\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.039994238792813344,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.039994238792813344\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2482758620689655,\n \"acc_stderr\": 0.036001056927277716,\n \"acc_norm\": 0.2482758620689655,\n \"acc_norm_stderr\": 0.036001056927277716\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.24867724867724866,\n \"acc_stderr\": 0.022261817692400175,\n \"acc_norm\": 0.24867724867724866,\n \"acc_norm_stderr\": 0.022261817692400175\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23809523809523808,\n \"acc_stderr\": 0.03809523809523811,\n \"acc_norm\": 0.23809523809523808,\n \"acc_norm_stderr\": 0.03809523809523811\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2903225806451613,\n \"acc_stderr\": 0.025822106119415888,\n \"acc_norm\": 0.2903225806451613,\n \"acc_norm_stderr\": 0.025822106119415888\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.27586206896551724,\n \"acc_stderr\": 0.03144712581678241,\n \"acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.03144712581678241\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.23030303030303031,\n \"acc_stderr\": 0.03287666758603488,\n \"acc_norm\": 0.23030303030303031,\n \"acc_norm_stderr\": 0.03287666758603488\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.3484848484848485,\n \"acc_stderr\": 0.033948539651564025,\n \"acc_norm\": 0.3484848484848485,\n \"acc_norm_stderr\": 0.033948539651564025\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.33678756476683935,\n \"acc_stderr\": 0.034107802518361825,\n \"acc_norm\": 0.33678756476683935,\n \"acc_norm_stderr\": 0.034107802518361825\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.33589743589743587,\n \"acc_stderr\": 0.023946724741563976,\n \"acc_norm\": 0.33589743589743587,\n \"acc_norm_stderr\": 0.023946724741563976\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.27037037037037037,\n \"acc_stderr\": 0.027080372815145668,\n \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.027080372815145668\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.2184873949579832,\n \"acc_stderr\": 0.02684151432295895,\n \"acc_norm\": 0.2184873949579832,\n \"acc_norm_stderr\": 0.02684151432295895\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.29724770642201837,\n \"acc_stderr\": 0.019595707224643544,\n \"acc_norm\": 0.29724770642201837,\n \"acc_norm_stderr\": 0.019595707224643544\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.23039215686274508,\n \"acc_stderr\": 0.029554292605695053,\n \"acc_norm\": 0.23039215686274508,\n \"acc_norm_stderr\": 0.029554292605695053\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2489451476793249,\n \"acc_stderr\": 0.028146970599422644,\n \"acc_norm\": 0.2489451476793249,\n \"acc_norm_stderr\": 0.028146970599422644\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.16591928251121077,\n \"acc_stderr\": 0.024967553196547136,\n \"acc_norm\": 0.16591928251121077,\n \"acc_norm_stderr\": 0.024967553196547136\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.32231404958677684,\n \"acc_stderr\": 0.042664163633521664,\n \"acc_norm\": 0.32231404958677684,\n \"acc_norm_stderr\": 0.042664163633521664\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.04284467968052192,\n \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.04284467968052192\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.26380368098159507,\n \"acc_stderr\": 0.03462419931615624,\n \"acc_norm\": 0.26380368098159507,\n \"acc_norm_stderr\": 0.03462419931615624\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.1875,\n \"acc_stderr\": 0.0370468111477387,\n \"acc_norm\": 0.1875,\n \"acc_norm_stderr\": 0.0370468111477387\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.03760178006026621,\n \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.03760178006026621\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.027236013946196663,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.027236013946196663\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.2656449553001277,\n \"acc_stderr\": 0.015794302487888726,\n \"acc_norm\": 0.2656449553001277,\n \"acc_norm_stderr\": 0.015794302487888726\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.23121387283236994,\n \"acc_stderr\": 0.022698657167855716,\n \"acc_norm\": 0.23121387283236994,\n \"acc_norm_stderr\": 0.022698657167855716\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.02564686309713791,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.02564686309713791\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.19614147909967847,\n \"acc_stderr\": 0.022552447780478026,\n \"acc_norm\": 0.19614147909967847,\n \"acc_norm_stderr\": 0.022552447780478026\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.023132376234543346,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.023132376234543346\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.24822695035460993,\n \"acc_stderr\": 0.02577001564429039,\n \"acc_norm\": 0.24822695035460993,\n \"acc_norm_stderr\": 0.02577001564429039\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.27249022164276404,\n \"acc_stderr\": 0.011371658294311525,\n \"acc_norm\": 0.27249022164276404,\n \"acc_norm_stderr\": 0.011371658294311525\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.030211479609121593,\n \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.030211479609121593\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.23039215686274508,\n \"acc_stderr\": 0.017035229258034044,\n \"acc_norm\": 0.23039215686274508,\n \"acc_norm_stderr\": 0.017035229258034044\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2545454545454545,\n \"acc_stderr\": 0.041723430387053825,\n \"acc_norm\": 0.2545454545454545,\n \"acc_norm_stderr\": 0.041723430387053825\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.32653061224489793,\n \"acc_stderr\": 0.030021056238440317,\n \"acc_norm\": 0.32653061224489793,\n \"acc_norm_stderr\": 0.030021056238440317\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.27860696517412936,\n \"acc_stderr\": 0.031700561834973086,\n \"acc_norm\": 0.27860696517412936,\n \"acc_norm_stderr\": 0.031700561834973086\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3192771084337349,\n \"acc_stderr\": 0.0362933532994786,\n \"acc_norm\": 0.3192771084337349,\n \"acc_norm_stderr\": 0.0362933532994786\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.2807017543859649,\n \"acc_stderr\": 0.034462962170884265,\n \"acc_norm\": 0.2807017543859649,\n \"acc_norm_stderr\": 0.034462962170884265\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24357405140758873,\n \"mc1_stderr\": 0.015026354824910782,\n \"mc2\": 0.41299297017962017,\n \"mc2_stderr\": 0.014938905945440792\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5027624309392266,\n \"acc_stderr\": 0.014052271211616438\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n 
}\n}\n```", "repo_url": "https://huggingface.co/kevin009/flyingllama-v2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|arc:challenge|25_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|gsm8k|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|hellaswag|10_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T09-40-33.484186.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T09-40-33.484186.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T09-40-33.484186.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T09-40-33.484186.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T09-40-33.484186.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_04T09_40_33.484186", "path": ["**/details_harness|winogrande|5_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-04T09-40-33.484186.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_04T09_40_33.484186", "path": ["results_2024-02-04T09-40-33.484186.parquet"]}, {"split": "latest", "path": ["results_2024-02-04T09-40-33.484186.parquet"]}]}]} | 2024-02-04T09:42:17+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of kevin009/flyingllama-v2
Dataset automatically created during the evaluation run of model kevin009/flyingllama-v2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
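```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_kevin009__flyingllama-v2",
	"harness_winogrande_5",
	split="train")
```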
## Latest results
These are the latest results from run 2024-02-04T09:40:33.484186 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of kevin009/flyingllama-v2\n\n\n\nDataset automatically created during the evaluation run of model kevin009/flyingllama-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T09:40:33.484186(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of kevin009/flyingllama-v2\n\n\n\nDataset automatically created during the evaluation run of model kevin009/flyingllama-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T09:40:33.484186(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
7349e0969eb3e036c6147b0ac082e80ea5ea453a |
### Description
This dataset is derived from an existing dataset made by AI4Bharat. We have used the [IndicSentiment](https://huggingface.co/datasets/ai4bharat/IndicSentiment) dataset of AI4Bharat to create an instruction-style dataset.
IndicSentiment is a multilingual parallel dataset for sentiment analysis. It encompasses product reviews, translations into Indic languages, sentiment labels, and more.
The original dataset (IndicSentiment) was made available under the CC0 license.
This dataset contains 10 splits with 1150+ rows each. Each split corresponds to a language.
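A minimal loading sketch follows; the split names (e.g. `hn` for Hindi, `ta` for Tamil) are taken from this repository's configuration, while the field names are only assumed from the prompt/completion templates shown in the next section.
```python
from datasets import load_dataset

# Load a single language split of the instruction-style dataset.
# Split names such as "hn" (Hindi) or "ta" (Tamil) come from the repo configuration.
hindi = load_dataset("el2e10/aya-indicsentiment", split="hn")

print(hindi)     # row count and column names
print(hindi[0])  # first example; prompt/completion fields are assumed, inspect the output to confirm
```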
### Template
The following templates were used for converting the original dataset:
```
#Template 1
prompt:
Translate from English to {target_language}:
{ENGLISH_REVIEW}
completion:
{INDIC_REVIEW}
```
```
#Template 2
prompt:
Translate this sentence to {target_language}:
{ENGLISH_REVIEW}
completion:
{INDIC_REVIEW}
```
```
#Template 3
prompt:
What's the {target_language} translation of this language:
{ENGLISH_REVIEW}
completion:
{INDIC_REVIEW}
```
```
#Template 4
prompt:
Can you translate this text to {target_language}:
{ENGLISH_REVIEW}
completion:
{INDIC_REVIEW}
``` | el2e10/aya-indicsentiment | [
"task_categories:conversational",
"size_categories:1K<n<10K",
"language:bn",
"language:gu",
"language:hi",
"language:kn",
"language:ml",
"language:mr",
"language:pa",
"language:ta",
"language:te",
"language:ur",
"license:cc",
"region:us"
] | 2024-02-04T09:43:38+00:00 | {"language": ["bn", "gu", "hi", "kn", "ml", "mr", "pa", "ta", "te", "ur"], "license": "cc", "size_categories": ["1K<n<10K"], "task_categories": ["conversational"], "pretty_name": "Aya-Indicsentiment", "configs": [{"config_name": "default", "data_files": [{"split": "bn", "path": "data/bn.parquet"}, {"split": "guj", "path": "data/guj.parquet"}, {"split": "hn", "path": "data/hn.parquet"}, {"split": "kn", "path": "data/kn.parquet"}, {"split": "ml", "path": "data/ml.parquet"}, {"split": "mr", "path": "data/mr.parquet"}, {"split": "pa", "path": "data/pa.parquet"}, {"split": "ta", "path": "data/ta.parquet"}, {"split": "te", "path": "data/te.parquet"}, {"split": "ur", "path": "data/ur.parquet"}]}]} | 2024-02-04T10:25:36+00:00 | [] | [
"bn",
"gu",
"hi",
"kn",
"ml",
"mr",
"pa",
"ta",
"te",
"ur"
] | TAGS
#task_categories-conversational #size_categories-1K<n<10K #language-Bengali #language-Gujarati #language-Hindi #language-Kannada #language-Malayalam #language-Marathi #language-Panjabi #language-Tamil #language-Telugu #language-Urdu #license-cc #region-us
|
### Description
This dataset is derived from the already existing dataset made by AI4Bharat. We have used the IndicSentiment dataset of AI4Bharat to create an instruction style dataset.
IndicSentiment is a multilingual parallel dataset for sentiment analysis. It encompasses product reviews, translations into Indic languages, sentiment labels, and more.
The original dataset(IndicSentiment) was made available under the cc-0 license.
This dataset contains 10 split with 1150+ rows each.Each split corresponds to a language.
### Template
The following template was used for converting the original dataset:
| [
"### Description\n\nThis dataset is derived from the already existing dataset made by AI4Bharat. We have used the IndicSentiment dataset of AI4Bharat to create an instruction style dataset. \n\nIndicSentiment is a multilingual parallel dataset for sentiment analysis. It encompasses product reviews, translations into Indic languages, sentiment labels, and more. \nThe original dataset(IndicSentiment) was made available under the cc-0 license. \n\nThis dataset contains 10 split with 1150+ rows each.Each split corresponds to a language.",
"### Template\n\nThe following template was used for converting the original dataset:"
] | [
"TAGS\n#task_categories-conversational #size_categories-1K<n<10K #language-Bengali #language-Gujarati #language-Hindi #language-Kannada #language-Malayalam #language-Marathi #language-Panjabi #language-Tamil #language-Telugu #language-Urdu #license-cc #region-us \n",
"### Description\n\nThis dataset is derived from the already existing dataset made by AI4Bharat. We have used the IndicSentiment dataset of AI4Bharat to create an instruction style dataset. \n\nIndicSentiment is a multilingual parallel dataset for sentiment analysis. It encompasses product reviews, translations into Indic languages, sentiment labels, and more. \nThe original dataset(IndicSentiment) was made available under the cc-0 license. \n\nThis dataset contains 10 split with 1150+ rows each.Each split corresponds to a language.",
"### Template\n\nThe following template was used for converting the original dataset:"
] |
fd5ee4bb161be12e061074a1a447537d761d0592 |
The text output flagging dataset of:
https://huggingface.co/spaces/Pendrokar/DeepMoji | Pendrokar/crowdsourced-deepmoji-flags | [
"license:mit",
"region:us"
] | 2024-02-04T09:49:49+00:00 | {"license": "mit", "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data.csv"}]}]} | 2024-02-10T00:27:23+00:00 | [] | [] | TAGS
#license-mit #region-us
|
The text output flagging dataset of:
URL | [] | [
"TAGS\n#license-mit #region-us \n"
] |
e056c66801356f83e825038ee0554a19d794ab6f |
# Dataset Card for Evaluation run of kevin009/TinyNaughtyLlama-v1.0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [kevin009/TinyNaughtyLlama-v1.0](https://huggingface.co/kevin009/TinyNaughtyLlama-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kevin009__TinyNaughtyLlama-v1.0",
"harness_winogrande_5",
split="train")
```
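If you are unsure which of the 63 per-task configurations to request, they can be enumerated first. A small sketch, assuming the standard `datasets` API:

```python
from datasets import get_dataset_config_names

# List the available per-task configurations; names follow the
# "harness_<task>_<num_fewshot>" pattern used by this repository.
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_kevin009__TinyNaughtyLlama-v1.0"
)
print(len(configs), configs[:5])
```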
## Latest results
These are the [latest results from run 2024-02-04T09:54:29.075197](https://huggingface.co/datasets/open-llm-leaderboard/details_kevin009__TinyNaughtyLlama-v1.0/blob/main/results_2024-02-04T09-54-29.075197.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.26467962102245735,
"acc_stderr": 0.031108218769225433,
"acc_norm": 0.2659055027594371,
"acc_norm_stderr": 0.03185886951972974,
"mc1": 0.23255813953488372,
"mc1_stderr": 0.0147891575310805,
"mc2": 0.36767435293950484,
"mc2_stderr": 0.013934545231005239
},
"harness|arc:challenge|25": {
"acc": 0.34044368600682595,
"acc_stderr": 0.013847460518892978,
"acc_norm": 0.35921501706484643,
"acc_norm_stderr": 0.014020224155839154
},
"harness|hellaswag|10": {
"acc": 0.45976897032463654,
"acc_stderr": 0.0049736029042478005,
"acc_norm": 0.6104361680940051,
"acc_norm_stderr": 0.004866547422355554
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2,
"acc_stderr": 0.034554737023254366,
"acc_norm": 0.2,
"acc_norm_stderr": 0.034554737023254366
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.18421052631578946,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.18421052631578946,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2792452830188679,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.2792452830188679,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2361111111111111,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.2361111111111111,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24277456647398843,
"acc_stderr": 0.0326926380614177,
"acc_norm": 0.24277456647398843,
"acc_norm_stderr": 0.0326926380614177
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617749,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617749
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2851063829787234,
"acc_stderr": 0.02951319662553935,
"acc_norm": 0.2851063829787234,
"acc_norm_stderr": 0.02951319662553935
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.03999423879281334,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.03999423879281334
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2206896551724138,
"acc_stderr": 0.03455930201924812,
"acc_norm": 0.2206896551724138,
"acc_norm_stderr": 0.03455930201924812
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.02241804289111396,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.02241804289111396
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.03970158273235173,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.03970158273235173
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.26129032258064516,
"acc_stderr": 0.024993053397764815,
"acc_norm": 0.26129032258064516,
"acc_norm_stderr": 0.024993053397764815
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2512315270935961,
"acc_stderr": 0.030516530732694433,
"acc_norm": 0.2512315270935961,
"acc_norm_stderr": 0.030516530732694433
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.296969696969697,
"acc_stderr": 0.03567969772268048,
"acc_norm": 0.296969696969697,
"acc_norm_stderr": 0.03567969772268048
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.02985751567338641,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.02985751567338641
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.23316062176165803,
"acc_stderr": 0.03051611137147601,
"acc_norm": 0.23316062176165803,
"acc_norm_stderr": 0.03051611137147601
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2641025641025641,
"acc_stderr": 0.022352193737453282,
"acc_norm": 0.2641025641025641,
"acc_norm_stderr": 0.022352193737453282
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.026842057873833706,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.026842057873833706
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.027553614467863818,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.027553614467863818
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23853211009174313,
"acc_stderr": 0.018272575810231867,
"acc_norm": 0.23853211009174313,
"acc_norm_stderr": 0.018272575810231867
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4305555555555556,
"acc_stderr": 0.03376922151252335,
"acc_norm": 0.4305555555555556,
"acc_norm_stderr": 0.03376922151252335
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.030190282453501936,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.030190282453501936
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.25316455696202533,
"acc_stderr": 0.02830465794303531,
"acc_norm": 0.25316455696202533,
"acc_norm_stderr": 0.02830465794303531
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3632286995515695,
"acc_stderr": 0.03227790442850499,
"acc_norm": 0.3632286995515695,
"acc_norm_stderr": 0.03227790442850499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.24427480916030533,
"acc_stderr": 0.03768335959728745,
"acc_norm": 0.24427480916030533,
"acc_norm_stderr": 0.03768335959728745
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.24793388429752067,
"acc_stderr": 0.039418975265163025,
"acc_norm": 0.24793388429752067,
"acc_norm_stderr": 0.039418975265163025
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2822085889570552,
"acc_stderr": 0.03536117886664743,
"acc_norm": 0.2822085889570552,
"acc_norm_stderr": 0.03536117886664743
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.2621359223300971,
"acc_stderr": 0.04354631077260597,
"acc_norm": 0.2621359223300971,
"acc_norm_stderr": 0.04354631077260597
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.27350427350427353,
"acc_stderr": 0.029202540153431166,
"acc_norm": 0.27350427350427353,
"acc_norm_stderr": 0.029202540153431166
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2822477650063857,
"acc_stderr": 0.016095302969878558,
"acc_norm": 0.2822477650063857,
"acc_norm_stderr": 0.016095302969878558
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.22254335260115607,
"acc_stderr": 0.02239421566194282,
"acc_norm": 0.22254335260115607,
"acc_norm_stderr": 0.02239421566194282
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.02428861946604611,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.02428861946604611
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.26366559485530544,
"acc_stderr": 0.02502553850053234,
"acc_norm": 0.26366559485530544,
"acc_norm_stderr": 0.02502553850053234
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25617283950617287,
"acc_stderr": 0.0242885336377261,
"acc_norm": 0.25617283950617287,
"acc_norm_stderr": 0.0242885336377261
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.26595744680851063,
"acc_stderr": 0.026358065698880592,
"acc_norm": 0.26595744680851063,
"acc_norm_stderr": 0.026358065698880592
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2379400260756193,
"acc_stderr": 0.01087570078769424,
"acc_norm": 0.2379400260756193,
"acc_norm_stderr": 0.01087570078769424
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.27941176470588236,
"acc_stderr": 0.027257202606114944,
"acc_norm": 0.27941176470588236,
"acc_norm_stderr": 0.027257202606114944
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24836601307189543,
"acc_stderr": 0.017479487001364764,
"acc_norm": 0.24836601307189543,
"acc_norm_stderr": 0.017479487001364764
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.34545454545454546,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.34545454545454546,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.15510204081632653,
"acc_stderr": 0.0231747988612186,
"acc_norm": 0.15510204081632653,
"acc_norm_stderr": 0.0231747988612186
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23383084577114427,
"acc_stderr": 0.029929415408348384,
"acc_norm": 0.23383084577114427,
"acc_norm_stderr": 0.029929415408348384
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3072289156626506,
"acc_stderr": 0.035915667978246635,
"acc_norm": 0.3072289156626506,
"acc_norm_stderr": 0.035915667978246635
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2573099415204678,
"acc_stderr": 0.03352799844161865,
"acc_norm": 0.2573099415204678,
"acc_norm_stderr": 0.03352799844161865
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23255813953488372,
"mc1_stderr": 0.0147891575310805,
"mc2": 0.36767435293950484,
"mc2_stderr": 0.013934545231005239
},
"harness|winogrande|5": {
"acc": 0.6022099447513812,
"acc_stderr": 0.013755743513749025
},
"harness|gsm8k|5": {
"acc": 0.024260803639120546,
"acc_stderr": 0.0042380079000014104
}
}
```
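The raw results file referenced above can also be fetched directly from the repository. A minimal sketch, assuming the `huggingface_hub` download API; the exact key layout inside the JSON is left to inspection:

```python
import json
from huggingface_hub import hf_hub_download

# Download the results file named in the "Latest results" section and parse it.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_kevin009__TinyNaughtyLlama-v1.0",
    filename="results_2024-02-04T09-54-29.075197.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)

# Inspect the top-level keys before drilling into the per-task entries.
print(list(results))
```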
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_kevin009__TinyNaughtyLlama-v1.0 | [
"region:us"
] | 2024-02-04T09:56:16+00:00 | {"pretty_name": "Evaluation run of kevin009/TinyNaughtyLlama-v1.0", "dataset_summary": "Dataset automatically created during the evaluation run of model [kevin009/TinyNaughtyLlama-v1.0](https://huggingface.co/kevin009/TinyNaughtyLlama-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kevin009__TinyNaughtyLlama-v1.0\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-04T09:54:29.075197](https://huggingface.co/datasets/open-llm-leaderboard/details_kevin009__TinyNaughtyLlama-v1.0/blob/main/results_2024-02-04T09-54-29.075197.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26467962102245735,\n \"acc_stderr\": 0.031108218769225433,\n \"acc_norm\": 0.2659055027594371,\n \"acc_norm_stderr\": 0.03185886951972974,\n \"mc1\": 0.23255813953488372,\n \"mc1_stderr\": 0.0147891575310805,\n \"mc2\": 0.36767435293950484,\n \"mc2_stderr\": 0.013934545231005239\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.34044368600682595,\n \"acc_stderr\": 0.013847460518892978,\n \"acc_norm\": 0.35921501706484643,\n \"acc_norm_stderr\": 0.014020224155839154\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.45976897032463654,\n \"acc_stderr\": 0.0049736029042478005,\n \"acc_norm\": 0.6104361680940051,\n \"acc_norm_stderr\": 0.004866547422355554\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.034554737023254366,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.034554737023254366\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.18421052631578946,\n \"acc_stderr\": 0.0315469804508223,\n \"acc_norm\": 0.18421052631578946,\n \"acc_norm_stderr\": 0.0315469804508223\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2792452830188679,\n \"acc_stderr\": 0.027611163402399715,\n \"acc_norm\": 0.2792452830188679,\n \"acc_norm_stderr\": 0.027611163402399715\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2361111111111111,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.2361111111111111,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24277456647398843,\n \"acc_stderr\": 0.0326926380614177,\n \"acc_norm\": 0.24277456647398843,\n \"acc_norm_stderr\": 0.0326926380614177\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617749,\n \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617749\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.2851063829787234,\n \"acc_stderr\": 0.02951319662553935,\n \"acc_norm\": 0.2851063829787234,\n \"acc_norm_stderr\": 0.02951319662553935\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.03999423879281334,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.03999423879281334\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2206896551724138,\n \"acc_stderr\": 0.03455930201924812,\n \"acc_norm\": 0.2206896551724138,\n \"acc_norm_stderr\": 0.03455930201924812\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.25396825396825395,\n \"acc_stderr\": 0.02241804289111396,\n \"acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.02241804289111396\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2698412698412698,\n \"acc_stderr\": 0.03970158273235173,\n \"acc_norm\": 0.2698412698412698,\n \"acc_norm_stderr\": 0.03970158273235173\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.26129032258064516,\n \"acc_stderr\": 0.024993053397764815,\n \"acc_norm\": 0.26129032258064516,\n \"acc_norm_stderr\": 0.024993053397764815\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2512315270935961,\n \"acc_stderr\": 0.030516530732694433,\n \"acc_norm\": 0.2512315270935961,\n \"acc_norm_stderr\": 0.030516530732694433\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.296969696969697,\n \"acc_stderr\": 0.03567969772268048,\n \"acc_norm\": 0.296969696969697,\n \"acc_norm_stderr\": 0.03567969772268048\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.22727272727272727,\n \"acc_stderr\": 0.02985751567338641,\n \"acc_norm\": 0.22727272727272727,\n \"acc_norm_stderr\": 0.02985751567338641\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.23316062176165803,\n \"acc_stderr\": 0.03051611137147601,\n \"acc_norm\": 0.23316062176165803,\n \"acc_norm_stderr\": 0.03051611137147601\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.2641025641025641,\n \"acc_stderr\": 0.022352193737453282,\n \"acc_norm\": 0.2641025641025641,\n \"acc_norm_stderr\": 0.022352193737453282\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.027553614467863818,\n \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.027553614467863818\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.23853211009174313,\n \"acc_stderr\": 0.018272575810231867,\n \"acc_norm\": 0.23853211009174313,\n \"acc_norm_stderr\": 0.018272575810231867\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4305555555555556,\n \"acc_stderr\": 0.03376922151252335,\n \"acc_norm\": 0.4305555555555556,\n \"acc_norm_stderr\": 0.03376922151252335\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.030190282453501936,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.030190282453501936\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.25316455696202533,\n \"acc_stderr\": 0.02830465794303531,\n \"acc_norm\": 0.25316455696202533,\n \"acc_norm_stderr\": 0.02830465794303531\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3632286995515695,\n \"acc_stderr\": 0.03227790442850499,\n \"acc_norm\": 0.3632286995515695,\n \"acc_norm_stderr\": 0.03227790442850499\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.03768335959728745,\n \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.03768335959728745\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.24793388429752067,\n \"acc_stderr\": 0.039418975265163025,\n \"acc_norm\": 0.24793388429752067,\n \"acc_norm_stderr\": 0.039418975265163025\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.23148148148148148,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.23148148148148148,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2822085889570552,\n \"acc_stderr\": 0.03536117886664743,\n \"acc_norm\": 0.2822085889570552,\n \"acc_norm_stderr\": 0.03536117886664743\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.2621359223300971,\n \"acc_stderr\": 0.04354631077260597,\n \"acc_norm\": 0.2621359223300971,\n \"acc_norm_stderr\": 0.04354631077260597\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.27350427350427353,\n \"acc_stderr\": 0.029202540153431166,\n \"acc_norm\": 0.27350427350427353,\n \"acc_norm_stderr\": 0.029202540153431166\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-miscellaneous|5\": 
{\n \"acc\": 0.2822477650063857,\n \"acc_stderr\": 0.016095302969878558,\n \"acc_norm\": 0.2822477650063857,\n \"acc_norm_stderr\": 0.016095302969878558\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.22254335260115607,\n \"acc_stderr\": 0.02239421566194282,\n \"acc_norm\": 0.22254335260115607,\n \"acc_norm_stderr\": 0.02239421566194282\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.02428861946604611,\n \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.02428861946604611\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.26366559485530544,\n \"acc_stderr\": 0.02502553850053234,\n \"acc_norm\": 0.26366559485530544,\n \"acc_norm_stderr\": 0.02502553850053234\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.25617283950617287,\n \"acc_stderr\": 0.0242885336377261,\n \"acc_norm\": 0.25617283950617287,\n \"acc_norm_stderr\": 0.0242885336377261\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.26595744680851063,\n \"acc_stderr\": 0.026358065698880592,\n \"acc_norm\": 0.26595744680851063,\n \"acc_norm_stderr\": 0.026358065698880592\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2379400260756193,\n \"acc_stderr\": 0.01087570078769424,\n \"acc_norm\": 0.2379400260756193,\n \"acc_norm_stderr\": 0.01087570078769424\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.27941176470588236,\n \"acc_stderr\": 0.027257202606114944,\n \"acc_norm\": 0.27941176470588236,\n \"acc_norm_stderr\": 0.027257202606114944\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.24836601307189543,\n \"acc_stderr\": 0.017479487001364764,\n \"acc_norm\": 0.24836601307189543,\n \"acc_norm_stderr\": 0.017479487001364764\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.34545454545454546,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.34545454545454546,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.15510204081632653,\n \"acc_stderr\": 0.0231747988612186,\n \"acc_norm\": 0.15510204081632653,\n \"acc_norm_stderr\": 0.0231747988612186\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23383084577114427,\n \"acc_stderr\": 0.029929415408348384,\n \"acc_norm\": 0.23383084577114427,\n \"acc_norm_stderr\": 0.029929415408348384\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3072289156626506,\n \"acc_stderr\": 0.035915667978246635,\n \"acc_norm\": 0.3072289156626506,\n \"acc_norm_stderr\": 0.035915667978246635\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.2573099415204678,\n \"acc_stderr\": 0.03352799844161865,\n \"acc_norm\": 0.2573099415204678,\n \"acc_norm_stderr\": 0.03352799844161865\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23255813953488372,\n \"mc1_stderr\": 0.0147891575310805,\n \"mc2\": 0.36767435293950484,\n \"mc2_stderr\": 0.013934545231005239\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6022099447513812,\n \"acc_stderr\": 0.013755743513749025\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.024260803639120546,\n 
\"acc_stderr\": 0.0042380079000014104\n }\n}\n```", "repo_url": "https://huggingface.co/kevin009/TinyNaughtyLlama-v1.0", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|arc:challenge|25_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|gsm8k|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|hellaswag|10_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T09-54-29.075197.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T09-54-29.075197.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T09-54-29.075197.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T09-54-29.075197.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T09-54-29.075197.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_04T09_54_29.075197", "path": ["**/details_harness|winogrande|5_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-04T09-54-29.075197.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_04T09_54_29.075197", "path": ["results_2024-02-04T09-54-29.075197.parquet"]}, {"split": "latest", "path": ["results_2024-02-04T09-54-29.075197.parquet"]}]}]} | 2024-02-04T09:56:37+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of kevin009/TinyNaughtyLlama-v1.0
Dataset automatically created during the evaluation run of model kevin009/TinyNaughtyLlama-v1.0 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
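For example, here is a minimal sketch using the `datasets` library; the repository name below follows the leaderboard's usual `details_<org>__<model>` naming convention and is an assumption, not confirmed by this card:

```python
from datasets import load_dataset

# Assumed repository name, based on the leaderboard's naming convention.
data = load_dataset(
    "open-llm-leaderboard/details_kevin009__TinyNaughtyLlama-v1.0",
    "harness_winogrande_5",  # any of the 63 configurations can be used here
    split="train",           # "train" points to the latest results
)
print(data[0])
```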
## Latest results
These are the latest results from run 2024-02-04T09:54:29.075197 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of kevin009/TinyNaughtyLlama-v1.0\n\n\n\nDataset automatically created during the evaluation run of model kevin009/TinyNaughtyLlama-v1.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T09:54:29.075197(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of kevin009/TinyNaughtyLlama-v1.0\n\n\n\nDataset automatically created during the evaluation run of model kevin009/TinyNaughtyLlama-v1.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T09:54:29.075197(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
5c1ff49af3cf9b80e6648970490612ab101db92e | # Dataset Card for "pokemon_caption_data_chatgpt"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | SminC/pokemon_caption_data_chatgpt | [
"region:us"
] | 2024-02-04T10:18:12+00:00 | {"dataset_info": {"features": [{"name": "original_image", "dtype": "image"}, {"name": "edit_prompt", "dtype": "string"}, {"name": "colored_image", "dtype": "image"}], "splits": [{"name": "train", "num_bytes": 69229976.0, "num_examples": 826}], "download_size": 69078894, "dataset_size": 69229976.0}} | 2024-02-12T07:38:29+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "pokemon_caption_data_chatgpt"
More Information needed | [
"# Dataset Card for \"pokemon_caption_data_chatgpt\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"pokemon_caption_data_chatgpt\"\n\nMore Information needed"
] |
098680aeb646ea5208fc9f1b685602e31890be4b |
# Evol-Instruct-Code-80k-v1
This is a cleansed version of [nickrosh/Evol-Instruct-Code-80k-v1](https://huggingface.co/datasets/nickrosh/Evol-Instruct-Code-80k-v1)
## Usage
```python
from datasets import load_dataset
dataset = load_dataset("Sharathhebbar24/Evol-Instruct-Code-80k-v1", split="train")
``` | Sharathhebbar24/Evol-Instruct-Code-80k-v1 | [
"task_categories:conversational",
"task_categories:text-generation",
"size_categories:10K<n<100K",
"language:en",
"license:apache-2.0",
"code",
"region:us"
] | 2024-02-04T11:24:56+00:00 | {"language": ["en"], "license": "apache-2.0", "size_categories": ["10K<n<100K"], "task_categories": ["conversational", "text-generation"], "pretty_name": "code", "dataset_info": {"features": [{"name": "prompt", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 123241726, "num_examples": 78264}], "download_size": 52294178, "dataset_size": 123241726}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "tags": ["code"]} | 2024-02-04T14:02:15+00:00 | [] | [
"en"
] | TAGS
#task_categories-conversational #task_categories-text-generation #size_categories-10K<n<100K #language-English #license-apache-2.0 #code #region-us
|
# Evol-Instruct-Code-80k-v1
This is a cleansed version of nickrosh/Evol-Instruct-Code-80k-v1
## Usage
| [
"# Evol-Instruct-Code-80k-v1\n\nThis is a cleansed version of nickrosh/Evol-Instruct-Code-80k-v1",
"## Usage"
] | [
"TAGS\n#task_categories-conversational #task_categories-text-generation #size_categories-10K<n<100K #language-English #license-apache-2.0 #code #region-us \n",
"# Evol-Instruct-Code-80k-v1\n\nThis is a cleansed version of nickrosh/Evol-Instruct-Code-80k-v1",
"## Usage"
] |
ec332798f158c523d1e0c69bda7adbd08a1bb1b4 | # Dataset Card for PochtaMarket products
### Dataset Summary
This dataset was scraped from product pages on the Russian marketplace [PochtaMarket](https://market.pochta.ru). It includes all information from the product card. The dataset was collected by processing around 500 thousand product pages, starting from the first one. At the time of collection, these are assumed to have been all the products available on this marketplace. Some fields may be empty, but each string is expected to contain some data; empty responses have been sorted out.
### Languages
The dataset is mostly in Russian, but there may be other languages present.
## Dataset Structure
### Data Fields
This dataset includes the following fields:
- `id`: Identifier for the product (integer)
- `name`: Name of the product (string)
- `description`: Short description of the product (string)
- `longDescription`: Detailed description of the product (string)
- `seoKeywords`: Search engine optimization keywords for the product (string)
- `brand`: Brand name associated with the product (string)
- `providerName`: Name of the provider or seller (string)
### Data Splits
All examples are in the train split, there is no validation split.
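A minimal usage sketch with the `datasets` library, assuming the dataset loads with the standard loader (field names as listed above):

```python
from datasets import load_dataset

# Load the single "train" split of the product cards.
products = load_dataset("nyuuzyou/PM-products", split="train")

example = products[0]
print(example["name"])               # product name
print(example["brand"])              # brand, may be empty
print(example["description"][:200])  # short description, truncated for display
```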
## Additional Information
### License
This dataset is dedicated to the public domain under the Creative Commons Zero (CC0) license. This means you can:
* Use it for any purpose, including commercial projects.
* Modify it however you like.
* Distribute it without asking permission.
No attribution is required, but it's always appreciated!
CC0 license: https://creativecommons.org/publicdomain/zero/1.0/deed.en
To learn more about CC0, visit the Creative Commons website: https://creativecommons.org/publicdomain/zero/1.0/
### Dataset Curators
- [nyuuzyou](https://ducks.party)
| nyuuzyou/PM-products | [
"task_categories:text-generation",
"task_ids:language-modeling",
"annotations_creators:crowdsourced",
"language_creators:crowdsourced",
"multilinguality:monolingual",
"size_categories:10K<n<100K",
"source_datasets:original",
"language:ru",
"license:cc0-1.0",
"region:us"
] | 2024-02-04T11:44:58+00:00 | {"annotations_creators": ["crowdsourced"], "language_creators": ["crowdsourced"], "language": ["ru"], "license": ["cc0-1.0"], "multilinguality": ["monolingual"], "size_categories": ["10K<n<100K"], "source_datasets": ["original"], "task_categories": ["text-generation"], "task_ids": ["language-modeling"], "pretty_name": "PochtaMarket products"} | 2024-02-04T11:45:50+00:00 | [] | [
"ru"
] | TAGS
#task_categories-text-generation #task_ids-language-modeling #annotations_creators-crowdsourced #language_creators-crowdsourced #multilinguality-monolingual #size_categories-10K<n<100K #source_datasets-original #language-Russian #license-cc0-1.0 #region-us
| # Dataset Card for PochtaMarket products
### Dataset Summary
This dataset was scraped from product pages on the Russian marketplace PochtaMarket. It includes all information from the product card. The dataset was collected by processing around 500 thousand, starting from the first one. At the time the dataset was collected, it is assumed that these were all the products available on this marketplace. Some fields may be empty, but the string is expected to contain some data, empty responses have been sorted.
### Languages
The dataset is mostly in Russian, but there may be other languages present.
## Dataset Structure
### Data Fields
This dataset includes the following fields:
- 'id': Identifier for the product (integer)
- 'name': Name of the product (string)
- 'description': Short description of the product (string)
- 'longDescription': Detailed description of the product (string)
- 'seoKeywords': Search engine optimization keywords for the product (string)
- 'brand': Brand name associated with the product (string)
- 'providerName': Name of the provider or seller (string)
### Data Splits
All examples are in the train split, there is no validation split.
## Additional Information
### License
This dataset is dedicated to the public domain under the Creative Commons Zero (CC0) license. This means you can:
* Use it for any purpose, including commercial projects.
* Modify it however you like.
* Distribute it without asking permission.
No attribution is required, but it's always appreciated!
CC0 license: URL
To learn more about CC0, visit the Creative Commons website: URL
### Dataset Curators
- nyuuzyou
| [
"# Dataset Card for PochtaMarket products",
"### Dataset Summary\n\nThis dataset was scraped from product pages on the Russian marketplace PochtaMarket. It includes all information from the product card. The dataset was collected by processing around 500 thousand, starting from the first one. At the time the dataset was collected, it is assumed that these were all the products available on this marketplace. Some fields may be empty, but the string is expected to contain some data, empty responses have been sorted.",
"### Languages\n\nThe dataset is mostly in Russian, but there may be other languages present.",
"## Dataset Structure",
"### Data Fields\n\nThis dataset includes the following fields:\n\n- 'id': Identifier for the product (integer)\n- 'name': Name of the product (string)\n- 'description': Short description of the product (string)\n- 'longDescription': Detailed description of the product (string)\n- 'seoKeywords': Search engine optimization keywords for the product (string)\n- 'brand': Brand name associated with the product (string)\n- 'providerName': Name of the provider or seller (string)",
"### Data Splits\n\nAll examples are in the train split, there is no validation split.",
"## Additional Information",
"### License\n\nThis dataset is dedicated to the public domain under the Creative Commons Zero (CC0) license. This means you can:\n\n* Use it for any purpose, including commercial projects.\n* Modify it however you like.\n* Distribute it without asking permission.\n\nNo attribution is required, but it's always appreciated!\n\nCC0 license: URL\n\nTo learn more about CC0, visit the Creative Commons website: URL",
"### Dataset Curators\n\n- nyuuzyou"
] | [
"TAGS\n#task_categories-text-generation #task_ids-language-modeling #annotations_creators-crowdsourced #language_creators-crowdsourced #multilinguality-monolingual #size_categories-10K<n<100K #source_datasets-original #language-Russian #license-cc0-1.0 #region-us \n",
"# Dataset Card for PochtaMarket products",
"### Dataset Summary\n\nThis dataset was scraped from product pages on the Russian marketplace PochtaMarket. It includes all information from the product card. The dataset was collected by processing around 500 thousand, starting from the first one. At the time the dataset was collected, it is assumed that these were all the products available on this marketplace. Some fields may be empty, but the string is expected to contain some data, empty responses have been sorted.",
"### Languages\n\nThe dataset is mostly in Russian, but there may be other languages present.",
"## Dataset Structure",
"### Data Fields\n\nThis dataset includes the following fields:\n\n- 'id': Identifier for the product (integer)\n- 'name': Name of the product (string)\n- 'description': Short description of the product (string)\n- 'longDescription': Detailed description of the product (string)\n- 'seoKeywords': Search engine optimization keywords for the product (string)\n- 'brand': Brand name associated with the product (string)\n- 'providerName': Name of the provider or seller (string)",
"### Data Splits\n\nAll examples are in the train split, there is no validation split.",
"## Additional Information",
"### License\n\nThis dataset is dedicated to the public domain under the Creative Commons Zero (CC0) license. This means you can:\n\n* Use it for any purpose, including commercial projects.\n* Modify it however you like.\n* Distribute it without asking permission.\n\nNo attribution is required, but it's always appreciated!\n\nCC0 license: URL\n\nTo learn more about CC0, visit the Creative Commons website: URL",
"### Dataset Curators\n\n- nyuuzyou"
] |
e474e780d4e2ec1b1fc248a5ca801f3adc8db3aa | # janitorai-cards
This dataset contains 190k cards that I received from janitorai, from a source that wished to remain anonymous.
My addition to this data is conversion of cards to [v2 character card](https://github.com/malfoyslastname/character-card-spec-v2/blob/main/README.md) format, and a local webpage that can be used to explore the dataset.
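As a rough illustration of what a converted card looks like when read in Python (the file name below is hypothetical, and the field names follow the v2 character card spec linked above):

```python
import json

# Hypothetical file name: any of the JSON cards extracted from cards-js.7z.
with open("cards-js/example_card.json", encoding="utf-8") as f:
    card = json.load(f)

# Per the v2 spec, the card's fields are nested under "data".
print(card["spec"])                       # expected: "chara_card_v2"
print(card["data"]["name"])               # character name
print(card["data"]["description"][:200])  # character description, truncated
```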
### Webpage

The webpage lets you browse cards, search by text, filter by tags and order by date/name/popularity.
To use the webpage, put [index.html](index.html) into a directory, and download and extract the archives into the same directory: [0123.zip](0123.zip), [4567.zip](4567.zip), [89ab.zip](89ab.zip), [cdef.zip](cdef.zip), and [html.zip](html.zip).
After that, just open [index.html](index.html) in the browser.
The directory structure should look like this:
```
📁
┣━━ 📄 index.html
┣━━ 📁 cards
┃ ┣━━ 📁 0
┃ ┣━━ 📁 1
┃ ┃ ...
┃ ┗━━ 📁 f
┗━━ 📁 html
┣━━ 📄 allcards.js
┣━━ 📄 cards.js
┗━━ 📄 cardsmeta.js
```
For performance reasons, the webpage only loads the 10,000 most popular cards when you open it. To view all, click the "Load all" button in the top row.
Caveat: instead of downloading the card, it opens it in a new page—you have to save it yourself. I can't figure out how to get the download to work.
### Files
- [0123.zip](0123.zip), [4567.zip](4567.zip), [89ab.zip](89ab.zip), [cdef.zip](cdef.zip) - archives with v2 character cards, tested to work with SillyTavern.
- [cards-js.7z](cards-js.7z) - all v2 character cards in json format, without images, tested to work with SillyTavern.
- [index.html](index.html) - webpage for browsing cards.
- [html.zip](html.zip) - files with information about cards - it's needed for the webpage to function.
- [orig.7z](orig.7z) - original json files with cards from janitorai - not compatible with any software.
| AUTOMATIC/jaicards | [
"task_categories:conversational",
"task_categories:text-generation",
"size_categories:100K<n<1M",
"license:mit",
"region:us"
] | 2024-02-04T12:38:52+00:00 | {"license": "mit", "size_categories": ["100K<n<1M"], "task_categories": ["conversational", "text-generation"]} | 2024-02-04T13:48:48+00:00 | [] | [] | TAGS
#task_categories-conversational #task_categories-text-generation #size_categories-100K<n<1M #license-mit #region-us
| # janitorai-cards
This dataset contains 190k cards that I received from janitorai, from a source that wished to remain anonymous.
My addition to this data is conversion of cards to v2 character card format, and a local webpage that can be used to explore the dataset.
### Webpage

Ther webpage lets you browse cards, search by text, fitler by tags and order by date/name/popularity.
To use the webpage, put URL into a directory, and download and extract archives into same directory: URL, URL, URL, URL, and URL.
After that, just open URL in the browser.
The directory structure should look like this:
For performance reasons, the webpage only loads 10000 most popular cards when you open it. To view all, click the "Load all" button in the top row.
Caveat: instead of downloading the card, it opens it in a new page—you have to save it yourself. I can't figure out how to get the download to work.
### Files
- URL, URL, URL, URL - archives with v2 character cards, tested to work with SillyTavern.
- cards-js.7z - all v2 character cards in json format, without images, tested to work with SillyTavern.
- URL - webpage for browsing cards.
- URL - files with information about cards - it's needed for the webpage to function.
- orig.7z - original json files with cards from janitorai - not compatible with any software.
| [
"# janitorai-cards\n\nThis dataset contains 190k cards that I received from janitorai, from a source that wished to remain anonymous.\n\nMy addition to this data is conversion of cards to v2 character card format, and a local webpage that can be used to explore the dataset.",
"### Webpage\n\n\n\nTher webpage lets you browse cards, search by text, fitler by tags and order by date/name/popularity.\n\nTo use the webpage, put URL into a directory, and download and extract archives into same directory: URL, URL, URL, URL, and URL.\n\nAfter that, just open URL in the browser.\n\nThe directory structure should look like this:\n\n\n\nFor performance reasons, the webpage only loads 10000 most popular cards when you open it. To view all, click the \"Load all\" button in the top row.\n\nCaveat: instead of downloading the card, it opens it in a new page—you have to save it yourself. I can't figure out how to get the download to work.",
"### Files\n\n- URL, URL, URL, URL - archives with v2 character cards, tested to work with SillyTavern.\n- cards-js.7z - all v2 character cards in json format, without images, tested to work with SillyTavern.\n- URL - webpage for browsing cards.\n- URL - files with information about cards - it's needed for the webpage to function.\n- orig.7z - original json files with cards from janitorai - not compatible with any software."
] | [
"TAGS\n#task_categories-conversational #task_categories-text-generation #size_categories-100K<n<1M #license-mit #region-us \n",
"# janitorai-cards\n\nThis dataset contains 190k cards that I received from janitorai, from a source that wished to remain anonymous.\n\nMy addition to this data is conversion of cards to v2 character card format, and a local webpage that can be used to explore the dataset.",
"### Webpage\n\n\n\nTher webpage lets you browse cards, search by text, fitler by tags and order by date/name/popularity.\n\nTo use the webpage, put URL into a directory, and download and extract archives into same directory: URL, URL, URL, URL, and URL.\n\nAfter that, just open URL in the browser.\n\nThe directory structure should look like this:\n\n\n\nFor performance reasons, the webpage only loads 10000 most popular cards when you open it. To view all, click the \"Load all\" button in the top row.\n\nCaveat: instead of downloading the card, it opens it in a new page—you have to save it yourself. I can't figure out how to get the download to work.",
"### Files\n\n- URL, URL, URL, URL - archives with v2 character cards, tested to work with SillyTavern.\n- cards-js.7z - all v2 character cards in json format, without images, tested to work with SillyTavern.\n- URL - webpage for browsing cards.\n- URL - files with information about cards - it's needed for the webpage to function.\n- orig.7z - original json files with cards from janitorai - not compatible with any software."
] |
5044fe93a4045eae6b56346a898dc89b18b2b11c |
# SQL Code
This is a cleansed version of [b-mc2/sql-create-context](https://huggingface.co/datasets/b-mc2/sql-create-context)
## Usage
```python
from datasets import load_dataset
dataset = load_dataset("Sharathhebbar24/sql-create-context", split="train")
``` | Sharathhebbar24/sql-create-context | [
"task_categories:conversational",
"task_categories:text-generation",
"size_categories:10K<n<100K",
"language:en",
"license:apache-2.0",
"code",
"region:us"
] | 2024-02-04T13:49:54+00:00 | {"language": ["en"], "license": "apache-2.0", "size_categories": ["10K<n<100K"], "task_categories": ["conversational", "text-generation"], "pretty_name": "sql", "dataset_info": {"features": [{"name": "prompt", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 19952977, "num_examples": 78577}], "download_size": 6313849, "dataset_size": 19952977}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "tags": ["code"]} | 2024-02-04T14:02:33+00:00 | [] | [
"en"
] | TAGS
#task_categories-conversational #task_categories-text-generation #size_categories-10K<n<100K #language-English #license-apache-2.0 #code #region-us
|
# SQL Code
This is a cleansed version of b-mc2/sql-create-context
## Usage
| [
"# SQL Code\n\nThis is a cleansed version of b-mc2/sql-create-context",
"## Usage"
] | [
"TAGS\n#task_categories-conversational #task_categories-text-generation #size_categories-10K<n<100K #language-English #license-apache-2.0 #code #region-us \n",
"# SQL Code\n\nThis is a cleansed version of b-mc2/sql-create-context",
"## Usage"
] |
e6dc27c8ff525664c39c3bbeb2e900b453d832af |
# Dataset Card for Dataset Name
This dataset was created to train a model that generates diverse image generation prompts from simple Japanese descriptions.
## Dataset Details
### Dataset Description
[STAIR Captions](https://github.com/STAIR-Lab-CIT/STAIR-captions) + [Microsoft COCO Caption](https://github.com/tylin/coco-caption) + [Microsoft COCO Images caption interpreted by UForm](https://huggingface.co/unum-cloud/uform-gen)
This dataset was created by having UForm explain Microsoft COCO Images of source Microsoft COCO based on the pair of translation information STAIR Captions and source information Microsoft COCO Caption. | taoki/stair-captions-prompts | [
"license:cc-by-4.0",
"region:us"
] | 2024-02-04T14:06:01+00:00 | {"license": "cc-by-4.0", "dataset_info": {"features": [{"name": "id", "dtype": "int64"}, {"name": "file_name", "dtype": "string"}, {"name": "instruction", "dtype": "string"}, {"name": "output", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 30613024, "num_examples": 82783}, {"name": "validation", "num_bytes": 14889077, "num_examples": 40504}], "download_size": 20826123, "dataset_size": 45502101}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}]} | 2024-02-04T14:26:16+00:00 | [] | [] | TAGS
#license-cc-by-4.0 #region-us
|
# Dataset Card for Dataset Name
This dataset was created to train a model that generates diverse image generation prompts from simple Japanese descriptions.
## Dataset Details
### Dataset Description
STAIR Captions + Microsoft COCO Caption + Microsoft COCO Images caption interpreted by UForm
This dataset was created by having UForm explain Microsoft COCO Images of source Microsoft COCO based on the pair of translation inforThis dataset was created to train a model that generates diverse image generation prompts from simple Japanese URL STAIR Captions and source information Microsoft COCO Caption. | [
"# Dataset Card for Dataset Name\n\nThis dataset was created to train a model that generates diverse image generation prompts from simple Japanese descriptions.",
"## Dataset Details",
"### Dataset Description\n\nSTAIR Captions + Microsoft COCO Caption + Microsoft COCO Images caption interpreted by UForm \n\nThis dataset was created by having UForm explain Microsoft COCO Images of source Microsoft COCO based on the pair of translation inforThis dataset was created to train a model that generates diverse image generation prompts from simple Japanese URL STAIR Captions and source information Microsoft COCO Caption."
] | [
"TAGS\n#license-cc-by-4.0 #region-us \n",
"# Dataset Card for Dataset Name\n\nThis dataset was created to train a model that generates diverse image generation prompts from simple Japanese descriptions.",
"## Dataset Details",
"### Dataset Description\n\nSTAIR Captions + Microsoft COCO Caption + Microsoft COCO Images caption interpreted by UForm \n\nThis dataset was created by having UForm explain Microsoft COCO Images of source Microsoft COCO based on the pair of translation inforThis dataset was created to train a model that generates diverse image generation prompts from simple Japanese URL STAIR Captions and source information Microsoft COCO Caption."
] |
8df2cc45aca9f5aac993086fe9dc3fcf6fc0aa66 |
# Open Platypus Code
This is a cleansed version of [garage-bAInd/Open-Platypus](https://huggingface.co/datasets/garage-bAInd/Open-Platypus)
## Usage
```python
from datasets import load_dataset
dataset = load_dataset("Sharathhebbar24/Open-Platypus", split="train")
``` | Sharathhebbar24/Open-Platypus | [
"task_categories:text-generation",
"size_categories:10K<n<100K",
"language:en",
"license:apache-2.0",
"math",
"region:us"
] | 2024-02-04T14:23:02+00:00 | {"language": ["en"], "license": "apache-2.0", "size_categories": ["10K<n<100K"], "task_categories": ["text-generation"], "dataset_info": {"features": [{"name": "prompt", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 31874014, "num_examples": 24926}], "download_size": 15340673, "dataset_size": 31874014}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "tags": ["math"]} | 2024-02-04T14:27:21+00:00 | [] | [
"en"
] | TAGS
#task_categories-text-generation #size_categories-10K<n<100K #language-English #license-apache-2.0 #math #region-us
|
# Open Platypus Code
This is a cleansed version of garage-bAInd/Open-Platypus
## Usage
| [
"# Open Platypus Code\n\nThis is a cleansed version of garage-bAInd/Open-Platypus",
"## Usage"
] | [
"TAGS\n#task_categories-text-generation #size_categories-10K<n<100K #language-English #license-apache-2.0 #math #region-us \n",
"# Open Platypus Code\n\nThis is a cleansed version of garage-bAInd/Open-Platypus",
"## Usage"
] |
7f60e9fa9b5a76ded5e6cd2cf52c841382dbca2b | # Code-Golang-QA-2k
This (small) dataset comprises 2,000 question-and-answer entries related to the Go programming language. It is designed to serve as a resource for individuals looking to enhance machine learning models, create chatbots, or simply to provide a comprehensive knowledge base for developers working with Go.
## Data Format
```json
[
{
"question": "How do you create a new RESTful API endpoint using Gin?",
"answer": "Creating a new RESTful API endpoint using the Gin framework in Go involves several steps. The Gin framework is a high-performance HTTP web framework that allows you to build web applications and microservices efficiently. It's designed to be faster and more efficient than other frameworks, making it an excellent choice for building RESTful APIs. Here's a step-by-step guide to creating a new RESTful API endpoint using Gin.\n\n### Step 1: Install Gin\n\nFirst, you need to install the Gin package if you haven't already. You can do this by running the following command:\n\n```bash\ngo get -u github.com/gin-gonic/gin\n```\n\nThis command will download and install the Gin package and its dependencies.\n\n### Step 2: Import Gin\n\nIn your Go file where you want to create the API endpoint, import the Gin package at the beginning:\n\n```go\npackage main\n\nimport \"github.com/gin-gonic/gin\"\n```\n\n### Step 3: Create a Gin Router\n\nNext, you create a router instance which will be used to handle the HTTP requests.\n\n```go\nfunc main() {\n router := gin.Default()\n}\n```\n\n### Step 4: Define the Endpoint\n\nNow, define the RESTful API endpoint. Let's say you want to create a simple `GET` endpoint that returns a message. You would use the `GET` method of the router to define the path and the function that will handle requests to that path.\n\n```go\nfunc main() {\n router := gin.Default()\n \n router.GET(\"/message\", func(c *gin.Context) {\n c.JSON(200, gin.H{\n \"message\": \"Hello, World!\",\n })\n })\n}\n```\n\nIn this example, when a client sends a `GET` request to `/message`, the function will return a JSON response with a message.\n\n### Step 5: Run the Server\n\nFinally, you need to start the HTTP server and specify the port it should listen on. This is done using the `Run` method of the router.\n\n```go\nfunc main() {\n router := gin.Default()\n\n router.GET(\"/message\", func(c *gin.Context) {\n c.JSON(200, gin.H{\n \"message\": \"Hello, World!\",\n })\n })\n\n router.Run(\":8080\") // Listen and serve on 0.0.0.0:8080\n}\n```\n\nThis will start the server, and it will listen for requests on port 8080. You can specify a different port by changing the argument to `Run`.\n\n### Testing the Endpoint\n\nYou can test the RESTful API endpoint using a tool like `curl` or Postman. For example, to test the `/message` endpoint with `curl`, you would run:\n\n```bash\ncurl http://localhost:8080/message\n```\n\nThis should return a JSON response like:\n\n```json\n{\"message\": \"Hello, World!\"}\n```\n\n### Conclusion\n\nBy following these steps, you've created a simple RESTful API endpoint using the Gin framework in Go. Gin makes it easy to define and handle routes, making it a powerful tool for building efficient web applications and services. You can expand on this by adding more endpoints, implementing middleware for authentication or logging, and integrating with databases to create full-fledged applications."
}
...
]
``` | ExAi/Code-Golang-QA-2k | [
"size_categories:1K<n<10K",
"license:apache-2.0",
"Golang",
"Code",
"Go",
"QA",
"region:us"
] | 2024-02-04T14:52:06+00:00 | {"license": "apache-2.0", "size_categories": ["1K<n<10K"], "tags": ["Golang", "Code", "Go", "QA"]} | 2024-02-04T16:57:34+00:00 | [] | [] | TAGS
#size_categories-1K<n<10K #license-apache-2.0 #Golang #Code #Go #QA #region-us
| # Code-Golang-QA-2k
This (small) dataset comprises 2,000 question-and-answer entries related to the Go programming language. It is designed to serve as a resource for individuals looking to enhance machine learning models, create chatbots, or simply to provide a comprehensive knowledge base for developers working with Go.
## Data Format
bash\ngo get -u URL main\n\nimport \"URL main() {\n router := gin.Default()\n}\ngo\nfunc main() {\n router := gin.Default()\n \n router.GET(\"/message\", func(c *gin.Context) {\n c.JSON(200, gin.H{\n \"message\": \"Hello, World!\",\n })\n })\n}\ngo\nfunc main() {\n router := gin.Default()\n\n router.GET(\"/message\", func(c *gin.Context) {\n c.JSON(200, gin.H{\n \"message\": \"Hello, World!\",\n })\n })\n\n router.Run(\":8080\") // Listen and serve on 0.0.0.0:8080\n}\nbash\ncurl http://localhost:8080/message\njson\n{\"message\": \"Hello, World!\"}\n | [
"# Code-Golang-QA-2k\n\nThis (small) dataset comprises 2,000 question-and-answer entries related to the Go programming language. It is designed to serve as a resource for individuals looking to enhance machine learning models, create chatbots, or simply to provide a comprehensive knowledge base for developers working with Go.",
"## Data Format\n\nbash\\ngo get -u URL main\\n\\nimport \\\"URL main() {\\n router := gin.Default()\\n}\\ngo\\nfunc main() {\\n router := gin.Default()\\n \\n router.GET(\\\"/message\\\", func(c *gin.Context) {\\n c.JSON(200, gin.H{\\n \\\"message\\\": \\\"Hello, World!\\\",\\n })\\n })\\n}\\ngo\\nfunc main() {\\n router := gin.Default()\\n\\n router.GET(\\\"/message\\\", func(c *gin.Context) {\\n c.JSON(200, gin.H{\\n \\\"message\\\": \\\"Hello, World!\\\",\\n })\\n })\\n\\n router.Run(\\\":8080\\\") // Listen and serve on 0.0.0.0:8080\\n}\\nbash\\ncurl http://localhost:8080/message\\njson\\n{\\\"message\\\": \\\"Hello, World!\\\"}\\n"
] | [
"TAGS\n#size_categories-1K<n<10K #license-apache-2.0 #Golang #Code #Go #QA #region-us \n",
"# Code-Golang-QA-2k\n\nThis (small) dataset comprises 2,000 question-and-answer entries related to the Go programming language. It is designed to serve as a resource for individuals looking to enhance machine learning models, create chatbots, or simply to provide a comprehensive knowledge base for developers working with Go.",
"## Data Format\n\nbash\\ngo get -u URL main\\n\\nimport \\\"URL main() {\\n router := gin.Default()\\n}\\ngo\\nfunc main() {\\n router := gin.Default()\\n \\n router.GET(\\\"/message\\\", func(c *gin.Context) {\\n c.JSON(200, gin.H{\\n \\\"message\\\": \\\"Hello, World!\\\",\\n })\\n })\\n}\\ngo\\nfunc main() {\\n router := gin.Default()\\n\\n router.GET(\\\"/message\\\", func(c *gin.Context) {\\n c.JSON(200, gin.H{\\n \\\"message\\\": \\\"Hello, World!\\\",\\n })\\n })\\n\\n router.Run(\\\":8080\\\") // Listen and serve on 0.0.0.0:8080\\n}\\nbash\\ncurl http://localhost:8080/message\\njson\\n{\\\"message\\\": \\\"Hello, World!\\\"}\\n"
] |
ac0938bac71c51336e55784616442602c33565fc | _**Algorithm_and_Python_Source_Code**_ <br />
This dataset provides different algorithms and their corresponding source code in Python. <br /> <br />
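A minimal loading sketch with the `datasets` library; this assumes the dataset loads with the standard loader and has a "train" split:

```python
from datasets import load_dataset

# Load the dataset and inspect the first record to see the available columns.
dataset = load_dataset("ananyarn/Algorithm_and_Python_Source_Code", split="train")
print(dataset[0])
```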
Credits: the source code given here is taken from the "iamtarun/python_code_instructions_18k_alpaca" dataset on Hugging Face. | ananyarn/Algorithm_and_Python_Source_Code | [
"language:en",
"license:apache-2.0",
"Python",
"Code Generation",
"Algorithm",
"Pseudo-code",
"Source Code",
"Programmming",
"Python Programming",
"region:us"
] | 2024-02-04T14:58:56+00:00 | {"language": ["en"], "license": "apache-2.0", "tags": ["Python", "Code Generation", "Algorithm", "Pseudo-code", "Source Code", "Programmming", "Python Programming"]} | 2024-02-05T08:28:02+00:00 | [] | [
"en"
] | TAGS
#language-English #license-apache-2.0 #Python #Code Generation #Algorithm #Pseudo-code #Source Code #Programmming #Python Programming #region-us
| _Algorithm_and_Python_Source_Code_ <br />
This dataset provides different algorithms and their corresponding source code in Python. <br /> <br />
credits: Source codes given here are taken from "iamtarun/python_code_instructions_18k_alpaca" dataset in Hugging Face. | [] | [
"TAGS\n#language-English #license-apache-2.0 #Python #Code Generation #Algorithm #Pseudo-code #Source Code #Programmming #Python Programming #region-us \n"
] |
71a1d75490d001a5d48b8e2b8f739211c61794b1 | # Dataset Card for "target-elements-0.2split"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | McSpicyWithMilo/target-elements-0.2split | [
"region:us"
] | 2024-02-04T15:15:50+00:00 | {"dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "target_element", "dtype": "string"}, {"name": "instruction_type", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 36440.0, "num_examples": 320}, {"name": "test", "num_bytes": 9110.0, "num_examples": 80}], "download_size": 24201, "dataset_size": 45550.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-02-04T15:16:03+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "target-elements-0.2split"
More Information needed | [
"# Dataset Card for \"target-elements-0.2split\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"target-elements-0.2split\"\n\nMore Information needed"
] |
a895a2147899a4805f2c7c144eff123ce175d9af | # Dataset Card for "target-elements-0.3split"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | McSpicyWithMilo/target-elements-0.3split | [
"region:us"
] | 2024-02-04T15:17:40+00:00 | {"dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "target_element", "dtype": "string"}, {"name": "instruction_type", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 31885.0, "num_examples": 280}, {"name": "test", "num_bytes": 13665.0, "num_examples": 120}], "download_size": 24258, "dataset_size": 45550.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-02-04T15:17:48+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "target-elements-0.3split"
More Information needed | [
"# Dataset Card for \"target-elements-0.3split\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"target-elements-0.3split\"\n\nMore Information needed"
] |
3ff61810b1822b1752e2e5454983daca2039bb37 |
If you guys find any errors, please contribute.
We are very open to it!
| AiHevenpen/rmvpe | [
"language:en",
"license:apache-2.0",
"music",
"code",
"region:us"
] | 2024-02-04T15:19:10+00:00 | {"language": ["en"], "license": "apache-2.0", "pretty_name": "RMVPE V2", "tags": ["music", "code"]} | 2024-02-04T15:41:58+00:00 | [] | [
"en"
] | TAGS
#language-English #license-apache-2.0 #music #code #region-us
|
if you guys find any errors please contribute
we are very open to it!
| [] | [
"TAGS\n#language-English #license-apache-2.0 #music #code #region-us \n"
] |
915349a473f4600a1fe3962cf82f4a79b40294ad | # Dataset Card for "random25eof_find_passage_train1000_eval1000_rare"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/random25eof_find_passage_train1000_eval1000_rare | [
"region:us"
] | 2024-02-04T15:30:49+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 298700, "num_examples": 3000}, {"name": "validation", "num_bytes": 118222, "num_examples": 1000}], "download_size": 181208, "dataset_size": 416922}} | 2024-02-04T15:44:23+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "random25eof_find_passage_train1000_eval1000_rare"
More Information needed | [
"# Dataset Card for \"random25eof_find_passage_train1000_eval1000_rare\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"random25eof_find_passage_train1000_eval1000_rare\"\n\nMore Information needed"
] |
f930cae1e18d3f998cf81723db4aa0b00f6d3ce2 | # Dataset Card for "random25eof_find_passage_train5000_eval1000_rare"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/random25eof_find_passage_train5000_eval1000_rare | [
"region:us"
] | 2024-02-04T15:31:01+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1132624, "num_examples": 11000}, {"name": "validation", "num_bytes": 118222, "num_examples": 1000}], "download_size": 452523, "dataset_size": 1250846}} | 2024-02-04T15:44:33+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "random25eof_find_passage_train5000_eval1000_rare"
More Information needed | [
"# Dataset Card for \"random25eof_find_passage_train5000_eval1000_rare\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"random25eof_find_passage_train5000_eval1000_rare\"\n\nMore Information needed"
] |
d40f007d08d52d168814df74aa6ccafe9e5d70b0 | # Dataset Card for "random25eof_find_passage_train10000_eval1000_rare"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/random25eof_find_passage_train10000_eval1000_rare | [
"region:us"
] | 2024-02-04T15:31:09+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2174452, "num_examples": 21000}, {"name": "validation", "num_bytes": 118222, "num_examples": 1000}], "download_size": 790893, "dataset_size": 2292674}} | 2024-02-04T15:44:43+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "random25eof_find_passage_train10000_eval1000_rare"
More Information needed | [
"# Dataset Card for \"random25eof_find_passage_train10000_eval1000_rare\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"random25eof_find_passage_train10000_eval1000_rare\"\n\nMore Information needed"
] |
d46ea947d7354f0116b6d805d59efa49032edb6a | # Dataset Card for "random25eof_find_passage_train50000_eval1000_rare"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/random25eof_find_passage_train50000_eval1000_rare | [
"region:us"
] | 2024-02-04T15:31:21+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 10511158, "num_examples": 101000}, {"name": "validation", "num_bytes": 118222, "num_examples": 1000}], "download_size": 0, "dataset_size": 10629380}} | 2024-02-04T15:44:46+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "random25eof_find_passage_train50000_eval1000_rare"
More Information needed | [
"# Dataset Card for \"random25eof_find_passage_train50000_eval1000_rare\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"random25eof_find_passage_train50000_eval1000_rare\"\n\nMore Information needed"
] |
a7a71283bdd6cb3b4f00f5836718ade643737ca6 | # Dataset Card for "random25eof_find_passage_train100000_eval1000_rare"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/random25eof_find_passage_train100000_eval1000_rare | [
"region:us"
] | 2024-02-04T15:31:34+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 20932934, "num_examples": 201000}, {"name": "validation", "num_bytes": 118222, "num_examples": 1000}], "download_size": 0, "dataset_size": 21051156}} | 2024-02-04T15:44:49+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "random25eof_find_passage_train100000_eval1000_rare"
More Information needed | [
"# Dataset Card for \"random25eof_find_passage_train100000_eval1000_rare\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"random25eof_find_passage_train100000_eval1000_rare\"\n\nMore Information needed"
] |
83bb4884f1a04979a3e797ddb0d6494eb760a57b | # Dataset Card for "random25eof_find_passage_train500000_eval1000_rare"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/random25eof_find_passage_train500000_eval1000_rare | [
"region:us"
] | 2024-02-04T15:31:49+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 104305810, "num_examples": 1001000}, {"name": "validation", "num_bytes": 118222, "num_examples": 1000}], "download_size": 0, "dataset_size": 104424032}} | 2024-02-04T15:44:53+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "random25eof_find_passage_train500000_eval1000_rare"
More Information needed | [
"# Dataset Card for \"random25eof_find_passage_train500000_eval1000_rare\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"random25eof_find_passage_train500000_eval1000_rare\"\n\nMore Information needed"
] |
88bca36cff5a8a53062633254346aefe406c7711 | # Dataset Card for "random25eof_find_passage_train1000000_eval1000_rare"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/random25eof_find_passage_train1000000_eval1000_rare | [
"region:us"
] | 2024-02-04T15:32:03+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 208524730, "num_examples": 2001000}, {"name": "validation", "num_bytes": 118222, "num_examples": 1000}], "download_size": 0, "dataset_size": 208642952}} | 2024-02-04T15:44:59+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "random25eof_find_passage_train1000000_eval1000_rare"
More Information needed | [
"# Dataset Card for \"random25eof_find_passage_train1000000_eval1000_rare\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"random25eof_find_passage_train1000000_eval1000_rare\"\n\nMore Information needed"
] |
b020992ef4c91aeecdc894d788e4daa935114817 | # Dataset Card for "random25eof_find_passage_train5000000_eval1000_rare"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/random25eof_find_passage_train5000000_eval1000_rare | [
"region:us"
] | 2024-02-04T15:32:27+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1042263000, "num_examples": 10001000}, {"name": "validation", "num_bytes": 118222, "num_examples": 1000}], "download_size": 0, "dataset_size": 1042381222}} | 2024-02-04T15:45:13+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "random25eof_find_passage_train5000000_eval1000_rare"
More Information needed | [
"# Dataset Card for \"random25eof_find_passage_train5000000_eval1000_rare\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"random25eof_find_passage_train5000000_eval1000_rare\"\n\nMore Information needed"
] |
3b35e9e80caf6dd920187c805037d922c7eea2b9 | # Dataset Card for "random25eof_find_passage_train10000000_eval1000_rare"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/random25eof_find_passage_train10000000_eval1000_rare | [
"region:us"
] | 2024-02-04T15:33:24+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2084437584, "num_examples": 20001000}, {"name": "validation", "num_bytes": 118222, "num_examples": 1000}], "download_size": 0, "dataset_size": 2084555806}} | 2024-02-04T15:45:40+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "random25eof_find_passage_train10000000_eval1000_rare"
More Information needed | [
"# Dataset Card for \"random25eof_find_passage_train10000000_eval1000_rare\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"random25eof_find_passage_train10000000_eval1000_rare\"\n\nMore Information needed"
] |
a85c190d9ef0b126bc5646ccde590ea69ab510af | # Dataset Card for "random25eof_find_passage_train50000000_eval1000_rare"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tyzhu/random25eof_find_passage_train50000000_eval1000_rare | [
"region:us"
] | 2024-02-04T15:35:57+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 10421821954, "num_examples": 100001000}, {"name": "validation", "num_bytes": 118222, "num_examples": 1000}], "download_size": 0, "dataset_size": 10421940176}} | 2024-02-04T15:47:37+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "random25eof_find_passage_train50000000_eval1000_rare"
More Information needed | [
"# Dataset Card for \"random25eof_find_passage_train50000000_eval1000_rare\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"random25eof_find_passage_train50000000_eval1000_rare\"\n\nMore Information needed"
] |
7cd0ac6f3aaf26fd69babc94ae320afea8e12e17 | # Examining LLM Quantization Impact
This document is a comparative analysis of qualitative performance degradation across llama.cpp quantization levels within a single 2x7B model. My hope is that it will help people unfamiliar with quant impacts get a sense of how quantization will affect output.
## Headings
1. [Quants](#quants)
2. [Test Set-Up](#test-set-up)
3. [Interpretation](#interpretation)
---
## Quants
The two metrics associated with LLM quantization that a model user will be concerned with are "perplexity" and "compression". A large language model with low perplexity is more likely to predict the next token (word) in a stream of text correctly. Quantizing a model increases its perplexity and reduces its size. Any increase in perplexity compounds with each token generated, degrading writing quality, so it is highly preferable to minimize the change in perplexity (Δppl). That being said, smaller models run faster and with fewer resources.
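As a sanity check on the compression column below, the percentage can be reproduced from the on-disk sizes reported for each quant later in this document. This is a minimal sketch only, with placeholder file names rather than the actual paths used:

```
# Hypothetical sketch: derive "compression" from file sizes (names are placeholders)
F16_BYTES=$(stat -c%s fusionnet_f16.gguf)        # e.g. 25759553440 for the f16 model
QUANT_BYTES=$(stat -c%s fusionnet_Q5_K_S.gguf)   # e.g. 8873785312 for Q5_K_S
echo "scale=4; (1 - $QUANT_BYTES / $F16_BYTES) * 100" | bc -l   # ≈65.5% reduced vs. f16
```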
I have ordered the quants by ascending Δppl. The Δppl values are copied from llama.cpp's quantize tool and reflect the change measured on LLaMA-v1-7B rather than on this model. Compression shows the percentage of size reduction compared to f16. Calculating perplexity directly on the quants I produced is too costly for me because I am GPU poor (it would take about 5 days per quant on my computer).
| Quant | Δppl | Compression |
| --- | --- | --- |
| [F16/F32](#f16) | N/A | N/A |
| [Q8_0](#q8_0) | +0.0004 | 46.87% |
| [Q6_K](#q6_k) | +0.0008 | 58.98% |
| [Q5_K_M](#q5_k_m) | +0.0122 | 64.55% |
| [Q5_1](#q5_1) | +0.0349 | 62.46% |
| [Q5_K_S](#q5_k_s) | +0.0400 | 65.55% |
| [Q4_K_M](#q4_k_m) | +0.0532 | 69.79% |
| [Q5_0](#q5_0) | +0.0683 | 65.44% |
| [Q4_K_S](#q4_k_s) | +0.0992 | 71.50% |
| [Q4_1](#q4_1) | +0.1585 | 68.64% |
| [Q3_K_L](#q3_k_l) | +0.1764 | 73.88% |
| [Q4_0](#q4_0) | +0.2166 | 71.62% |
| [Q3_K_M](#q3_k_m) | +0.2496 | 75.91% |
| [Q3_K_S](#q3_k_s) | +0.5551 | 78.31% |
| [Q3_K_XS](#q3_k_xs) | 3-bit extra small quantization | 79.69% |
| [IQ3_XXS](#iq3_xxs) | 3.06 bpw quantization | 80.35% |
| [Q2_K](#q2_k) | +0.6717 | 81.52% |
| [Q2_K_S](#q2_k_s) | +9.0634 | 82.79% |
| [IQ2_XS](#iq2_xs) | 2.31 bpw quantization | 85.10% |
| [IQ2_XXS](#iq2_xxs) | 2.06 bpw quantization | 86.56% |
## Test Set-Up
### Model
https://huggingface.co/TomGrc/FusionNet_7Bx2_MoE_14B
### Quantization
1. Used llama.cpp:convert.py to convert the model to an f16 GGUF
2. Calculated an importance matrix using wikitext-2-raw/wiki.train.raw
3. Used llama.cpp:quantize to quantize the f16 model to the various quants with the calculated importance matrix (a command-line sketch of these steps follows).
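The sketch below shows roughly what those three steps look like on the command line. It is an illustration only: the exact script names and flags differ between llama.cpp revisions, and the file paths here are placeholders rather than the ones actually used.

```
# Hypothetical sketch of the quantization pipeline (paths and flags are assumptions)
# 1. Convert the HF model to an f16 GGUF
python convert.py ~/models/FusionNet_7Bx2_MoE_14B --outtype f16 --outfile fusionnet_f16.gguf

# 2. Calculate an importance matrix over wikitext-2-raw
./build/bin/imatrix -m fusionnet_f16.gguf -f wikitext-2-raw/wiki.train.raw -o fusionnet.imatrix

# 3. Quantize the f16 GGUF to a target quant using the importance matrix
./build/bin/quantize --imatrix fusionnet.imatrix fusionnet_f16.gguf fusionnet_Q5_K_S.gguf Q5_K_S
```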
### Input
I created a file: prompts/logic_precidence_inference.txt
```
# Instructions
The following is an interaction between an AI model and a user where the user is posing a question to test the ability of the AI to make assumumptions by making inferences from the context. These are potentially riddles and to answer correctly, the AI should infer the majority case when faced with ambiguity.
# Prompt
Assistant: How can I help you today?
User: Alex is Charlie's father. Which one of them was born later?
Assistant:
```
This input was run many times to see how the answers vary. Distinct classes of answer were noted, while functionally equivalent answers were not recorded separately.
A secondary case with 10 runs includes an additional instruction: "Furthermore, the AI is to answer tersely, not explaining it's rationale. Single word answers are optimal." *Note: 10 runs may not be enough*
### Command
```
./build/bin/main \
--n-gpu-layers -6 \
-m ~/models/FusionNet_7Bx2_MoE_14B_GGUF/TomGrc_FusionNet_7Bx2_MoE_14B_Q5_K_S.gguf \
-f prompts/logic_precidence_inference.txt \
-c 1024 \
-n 1024 \
--repeat_penalty 1.0
```
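For the bare-word tallies, the command above was simply repeated and the outputs were classified by hand. A loop along these lines reproduces that workflow (my sketch; the original runs were not necessarily scripted this way):

```
# Hypothetical sketch: repeat the run and collect outputs for manual classification
MODEL=~/models/FusionNet_7Bx2_MoE_14B_GGUF/TomGrc_FusionNet_7Bx2_MoE_14B_Q5_K_S.gguf
for i in $(seq 1 10); do
  ./build/bin/main --n-gpu-layers -6 -m "$MODEL" \
    -f prompts/logic_precidence_inference.txt \
    -c 1024 -n 1024 --repeat_penalty 1.0 >> runs_Q5_K_S.txt
done
# Each answer was then binned by hand as CORRECT, INCORRECT, REFUSAL, etc.
```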
---
## f16
Non-Quanted Model
Size: 24G (25759553440)
### INCORRECT
> Considering the typical generational order between a father and his child, we can infer that Charlie was most likely born before Alex became a father. Therefore, Alex was probably born after Charlie. So, Alex was born later in comparison to the given context.
### CORRECT
> Considering the usual family structure and age dynamics between parents and their children, it is more likely that Charlie was born later than Alex. This assumption holds true unless there is any specific context given to suggest otherwise.
### Bare-word Results
```
[CORRECT] 9 - *,*,*,*,*,*,*,*,* (Charlie)
[INCORRECT] 1 - * (Alex)
```
---
## Q8_0
Performs roughly as well as the non-quantized model
Size: 13G (13685856224)
### CORRECT
> Considering the traditional family structure where parents are older than their children, it is likely that Charlie was born later than Alex.
### RIGHT ANSWER - BAD LOGIC (partially)
> Considering the given information, it's logical to assume that Charlie was born after Alex since Alex would have been Charlie's age when Charlie was born. So in most cases, Alex would have been older when Charlie was born, implying Charlie was born later relatively in comparison. But it's important to note that this assumption does not hold for all circumstances (e.g., if Alex himself was born later in life), but it's the majority case based on typical generational gaps.
### Bare-word Results
```
[CORRECT] 9 - *,*,*,*,*,*,*,*,* (Charlie)
[INCORRECT] 1 - * (Alex)
```
---
## Q6_K
Best performing, even ahead of the non-quanted model, oddly. No wrong answers or refusals were generated in long-form.
Size: 9.9G (10566817760)
### CORRECT
> Considering the usual family dynamics and age progression, it is most likely that Charlie was born later than his father Alex. This assumption is made due to limited context and the general understanding that parents are older than their children. However, there may be exceptions to this rule in certain circumstances.
### Bare-word Results
```
[CORRECT] 10 - *,*,*,*,*,*,*,*,*,* (Charlie)
[INCORRECT] 0 - Nil (Alex)
```
---
## Q5_K_M
Gave refusal and incorrect long-form answers. Bare-word performance was perfect.
Size: 8.6G (9132259296)
### REFUSAL
> Considering the given information only states a parent-child relationship between Alex and Charlie, it cannot be definitively concluded who was born later. Typically, parents are older than their children but without a specific age mentioned, we cannot make that assumption. So, in this context, it is ambiguous.
### INCORRECT
> Considering the normal course of events where parents give birth before having children themselves, it is likely that Charlie was born earlier than Alex. Therefore, Alex was born later compared to Charlie.
### CORRECT
> Considering the common order of events in life, it's usually assumed that parents are older than their children. So in this case, it can be inferred that Charlie was born later than when Alex became a father, making Charlie the one born after a later point in time compared to Alex. Hence, Charlie was likely born later.
### Bare-word Results
```
[CORRECT] 10 - *,*,*,*,*,*,*,*,*,* (Charlie)
[INCORRECT] 0 - Nil (Alex)
```
---
## Q5_1
Long-form writing is slightly weakened. Bare-word results are normal.
Size: 9.1G (9670506464)
### CORRECT
> Considering the given information, it's logical to assume that when Alex is Charlie's father, Alex was born before Charlie. Fatherhood typically occurs after the child's birth. Therefore, in most cases, Charlie was born later compared to Alex.
> However, it's important to note that this assumption relies on typical life events and might not hold in every scenario.
### INCORRECT
> Considering the typical generational order, it's generally assumed that parents are older than their children. Therefore, in this case, Charlie was likely born before Alex became a father, implying Charlie was born earlier but Alex started his parental role later. So, Alex was born later relatively when compared to being Charlie's age at conception.
### Bare-word Results
```
[CORRECT] 9 - *,*,*,*,*,*,*,*,* (Charlie)
[INCORRECT] 1 - * (Alex)
```
---
## Q5_K_S
The long-form writing has become unreliable and gives predominantly non-correct answers, but is fine when correct. Bare-word results are good.
Size: 8.3G (8873785312)
### CORRECT
> Considering the typical generational relationship between a father and child, it is likely that Alex was born earlier than Charlie. So, Charlie would have been born later relatively when compared to his father Alex.
### INCORRECT
> Considering the typical parent-child relationship where parents are older than their children, it is most likely that Charlie was born earlier than Alex became a father. Hence, Alex was born later relatively compared to Charlie.
### ANSWERING A PROXIMAL QUESTION
> Considering the given information where Alex is Charlie's father, it's generally assumed that a parent is older than their child. Thus, we can infer that Alex was probably born earlier than Charlie.
### Bare-word Results
```
[CORRECT] 9 - *,*,*,*,*,*,*,*,* (Charlie)
[INCORRECT] 1 - * (Alex)
```
---
## Q4_K_M
Predominantly correct long-form answers. Perfect bare-word results. No Refusals.
Size: 7.3G (7782086624)
### CORRECT
> Considering the given information where Alex is Charlie's father, it is generally assumed that a father is older than his child. Therefore, it can be inferred that Alex was born earlier than Charlie. Hence, Charlie would have been born later compared to Alex.
### INCORRECT
> Considering the given information, it is generally assumed that parents are older than their children when they have them. Therefore, in most cases, Charlie would have been born before Alex if Alex is Charlie's father. Hence, Alex (as the father) was born later relatively compared to Charlie."
### ANSWERING A PROXIMAL QUESTION
> Considering the typical order of events in a family structure, when Alex is Charlie's father, it is generally assumed that Alex was born earlier than Charlie."
### Bare-word Results
```
[CORRECT] 10 - *,*,*,*,*,*,*,*,*,* (Charlie)
[INCORRECT] 0 - Nil (Alex)
```
---
## Q5_0
Both long-form and bare-word have lost a little reliability.
Size: 8.3G (8903145440)
### CORRECT
> Considering the typical generational order where parents are older than their children, it is more likely that Charlie was born later than Alex.
### INCORRECT
> Considering the typical order of events where a child is born after their parents, it is likely that Charlie was born before Alex became his father, implying Charlie was born later relatively compared to Alex. However, without specific birth dates, this answer is subject to change if unusual circumstances exist. But generally speaking, Charlie would have been born later than when Alex became his father.
### CONFUSED
> Considering the typical family structure where a parent is older than their child, it is most likely that Charlie was born earlier than Alex became a father, implying Charlie was born later relatively but Alex was born before. So in this scenario, Alex was born earlier but Charlie was born comparatively later in their lives.
### Bare-word Results
```
[CORRECT] 8 - *,*,*,*,*,*,*,* (Charlie)
[INCORRECT] 2 - *,* (Alex)
```
---
## Q4_K_S
Good long-form behaviour. Perfect bare-word performance. Low variability observed.
Size: 6.9G (7341160416)
### CORRECT
> Considering the given context where Alex is Charlie's father, it is generally assumed that an older person becomes a parent to a younger one. Thus, we can infer that Charlie was born later than Alex.
### INCORRECT
> Considering the typical family dynamic where parents are older than their children, it is more likely that Charlie was born earlier than Alex. Hence, Alex was born later compared to Charlie in most scenarios.
### Bare-word Results
```
[CORRECT] 10 - *,*,*,*,*,*,*,*,*,* (Charlie)
[INCORRECT] 0 - Nil (Alex)
```
---
## Q4_1
In long-form, the model frequently answered a proximal question (who was born earlier) rather than the exact question. No incorrect answers observed in long-form.
Size: 7.6G (8077064160)
### CORRECT
> Considering the typical generation gap between a parent and their child, it's more likely that Alex (Charlie's father) was born earlier than Charlie. Therefore, Charlie was born later in this scenario.
### REFUSAL
> Considering the given information only states the relationship between Alex and Charlie as father and son, it does not provide enough context to determine when they were born or who was born later. Generally speaking, parents are born before their children but without a specific timeline, we cannot assume the exact order of birth here.
### RIGHT ANSWER - BAD LOGIC
> Considering the given information, it's most likely that Charlie was born after Alex since Alex would have been Charlie's age when Charlie was born. However, direct birth order cannot be definitively concluded without more context.
### ANSWERING A PROXIMAL QUESTION
> Considering the given information where Alex is Charlie's father, it would typically imply a generational relationship with Alex being older than Charlie. Therefore, Alex was most likely born before Charlie.
### Bare-word Results
```
[CORRECT] 9 - *,*,*,*,*,*,*,*,* (Charlie)
[INCORRECT] 1 - * (Alex)
```
---
## Q3_K_L
No incorrect or refusal long-form answers. Bare-word performance was degraded somewhat.
Size: 6.3G (6729209824)
### CORRECT
> In general, when discussing parent-child relationships, it's assumed that the parent was born earlier than the child. Therefore, Charlie was likely born later than his father Alex.
> However, to provide a definitive answer, we would need more context or information about their birth dates. But given the typical parent-child relationship dynamic, it's safe to infer this scenario.
### ANSWERING A PROXIMAL QUESTION
> In the majority case when considering age differences between a father and child, it's assumed that the parent (Alex in this scenario) is older than their offspring (Charlie). Therefore, Alex was most likely born earlier than Charlie.
### Bare-word Results
```
[CORRECT] 8 - *,*,*,*,*,*,*,* (Charlie)
[INCORRECT] 2 - *,* (Alex)
```
---
## Q4_0
Long-form answers were evenly distributed among classifications, meaning it did not reliably give correct answers. Bare-word performance was perfect.
Size: 6.9G (7309703136)
### CORRECT
> Considering the typical order of events, when a person becomes a parent (as Alex is to Charlie), it usually implies that Alex was born earlier. Therefore, it's more likely that Charlie was born later in this context.
### INCORRECT
> Considering the normal life progression where parents are older than their children, it can be inferred that Charlie was born earlier, thus making Alex, his father, born later. So, Alex was born later compared to Charlie.
### ANSWERING A PROXIMAL QUESTION
> Considering the general progression of life, parents are typically older than their children when they give birth. Hence, it is more likely that Alex was born earlier than Charlie in this scenario.
### Bare-word Results
```
[CORRECT] 10 - *,*,*,*,*,*,*,*,*,* (Charlie)
[INCORRECT] 0 - Nil (Alex)
```
---
## Q3_K_M
Good bare-word performance, even if it wasn't all correct answers.
Size: 5.8G (6205970400)
### CORRECT
> In most cases, when a parent gives birth to a child, it's assumed that the parent was born earlier than their child. So, in this scenario where Alex is Charlie's father, it's likely that Alex was born before Charlie. Hence, Charlie was born later compared to Alex.
### ANSWERING A PROXIMAL QUESTION
> In most cases, when a person becomes a parent, it means they have already lived through their own childhood and adolescence before having a child. Therefore, it can be inferred that Alex (Charlie's father) was born earlier than Charlie.
### Bare-word Results
```
[CORRECT] 7 - *,*,*,*,*,*,* (Charlie)
[INCORRECT] 3 - *,*,* (Alex)
```
---
## Q3_K_S
Mostly correct output in long form. Mostly incorrect output in bare-word.
Size: 5.3G (5587310560)
### CORRECT
> In most cases, when a person is someone's father, it implies that the father was born before the child. Therefore, Charlie would have been born later than Alex in the majority of situations.
> However, it's important to note that exceptions can exist (for example, in cases of assisted reproduction, where the age difference is compressed, or if they are related but not biologically father and son). In general discourse and without further context, we would assume the majority case.
> So, in the assumed majority case, Charlie was born later than Alex.
### ANSWERING A PROXIMAL QUESTION
> In this scenario, it is generally assumed that when someone is referred to as a father, they are the older parent. Therefore, it is likely that Alex, being Charlie's father, was born earlier than Charlie.
> However, without further context or information, it is impossible to know for certain if there are any exceptional circumstances such as age reversal treatments or other factors that may have changed their relative ages. So, in the majority case, Alex was born earlier than Charlie.
### Bare-word Results
```
[CORRECT] 2 - *,* (Charlie)
[INCORRECT] 7 - *,*,*,*,*,*,* (Alex)
[INCORRECT+RATIONALE] 1 - *
```
---
## Q3_K_XS
No wrong answers or refusals were generated in long-form. In one case, the AI attempted to answer both ways for a single individual. Bad bare-word performance.
Size: 4.9G (5230532576)
### CORRECT
> Considering the typical generational order between a parent and child, it is more likely that Alex, being Charlie's father, was born earlier than Charlie. Thus, Charlie was born later in comparison to Alex.
### SLIGHTLY HEDGED
> It is more likely that Alex was born earlier since he is Charlie's father. Generally speaking, a parent is older than their child. Therefore, we can assume that Alex was born later as a child but earlier as a parent when compared to Charlie.
### Bare-word Results
```
[CORRECT] 2 - *,* (Charlie)
[INCORRECT] 5 - *,*,*,*,* (Alex)
[INCORRECT+RATIONALE] 3 - *,*,*
```
---
## IQ3_XXS
No incorrect or refusal long-form answers. Good Bare-word performance.
Size: 4.8G (5062604768)
### CORRECT
> In this scenario, since Alex is Charlie's father, it's more likely that Alex was born earlier. Therefore, Charlie would have been born later than his father Alex.
### ANSWERING A PROXIMAL QUESTION
> Considering the given information only states the relationship between Alex and Charlie as father and son respectively, it is assumed that Alex is older since he is the parent of Charlie. Typically, parents are older than their children. However, this assumption depends on other factors, like if Charlie had Alex late in life or if Alex had a late start to fatherhood due to infertility treatments, that might change the scenario. But in a general case, Alex would have been born earlier than Charlie.
### Bare-word Results
```
[CORRECT] 9 - *,*,*,*,*,*,*,*,* (Charlie)
[INCORRECT] 1 - * (Alex)
```
---
## Q2_K
No wrong answers or refusals were generated in long-form. Bad bare-word performance.
Size: 4.5G (4760303584)
### CORRECT
> Considering that parenthood generally happens before a child's birth, it's more likely that Charlie was born later than Alex.
### Bare-word Results
```
[CORRECT] 3 - *,*,* (Charlie)
[INCORRECT] 2 - *,* (Alex)
[INCORRECT+RATIONALE] 1 - *
[BAD_ANSWER] 4 - *,*,*,* (CHANTLEE,Chartie,Chainler)
```
---
## Q2_K_S
Answers were typically mixed-language (French, code, no spaces) or refusals. It never gave a correct response.
Size: 4.2G (4434196448)
### REFUSAL (AND FRENCH)
> même si Alex est le père de Charlie, cela ne nous donne pas d'information sur quand ils sont nés pour déterminer qui est né plus tard. On devrait savoir leur âge ou la période de leur naissance pour résoudre ce riddles.
> # Translation
> Even though Alex is Charlie's father, this does not provide us with information about when they were born to determine who was born later. In order to solve this riddle, we need to know their age or the period of their birth.
### INCORRECT
> opponentially but considering the given information but assuming Charlie is an adult with age comparable to or older than Alex's age range as parents, we can infer that Charlie was born earlier than Alex since Alex is Charlie's father.
### Bare-word Results
```
[CORRECT] 0 - Nil (Charlie)
[INCORRECT] 0 - Nil (Alex)
[INCORRECT+RATIONALE] 2 - *,*
[BAD_ANSWER+RATIONALE] 8 - *,*,*,*,*,*,*,*
```
---
## IQ2_XS
This is much better than the incoherent results of IQ2_XXS.
Size: 3.6G (3837450208)
### INCORRECT
> In this case, it is more likely assumed that Charlie was born earlier than Alex since Alex is his father, implying an age relationship where a father is usually older than his child.
### CORRECT
> In this scenario, it is generally assumed that since Alex is Charlie's father, Charlie would be Alex's younger counterpart as father-child relationships inherently involve an older parent and a younger offspring.
### Bare-word Results
```
[CORRECT] 3 - *,*,* (Charlie)
[INCORRECT] 0 - Nil (Alex)
[CORRECT+RATIONALE] 5 - *,*,*,*,*
[INCORRECT+RATIONALE] 1 - *
[BAD_ANSWER+RATIONALE] 1 - *
```
---
## IQ2_XXS
Not recommended. Answers fell into many possible classifications, and it was not predictable which one the quant would produce.
Size: 3.3G (3462060000)
### INCORRECT
> To infer the answer correctly, we need more information about their ages, as having only the relationship between them doesn't provide enough context. However, if it is the majority case that older people tend to have children younger than themselves, we can assume that Charlie was born earlier and Alex was born later. But without more contextual details, this remains uncertain.
> Glastring ambiguity without clear ages.
### CORRECT
> To solve this riddle, we need more specific information about the age gap or birth dates between Alex and Charlie. However, by default assuming equality between their age differences as average across society today, it's more common that Charlie, being younger than Alex who is his father, was born later.
### INCOHERENT
> To assume age equality between Alex and Charlie initially remploys ambiguity but considering age limitations and generational shifts Alex (born older) would presumably be older still whereas Charlie being his son would have been born later relatively recently compared to Alex's age range.
### RIGHT ANSWER - BAD LOGIC
> To solve this riddle efficiently, we should consider the age difference and the most common scenario of generations in families. Typically, younger generations tend to have children rather than the opposite. Therefore, it's more likely that Charlie was born later than Alex. However, without specific birth dates or ages available, we cannot definitively confirm this assumption.
> Answer Late: Charlie
### ANSWERING A PROXIMAL QUESTION
> To assume age equality between Alex and Charlie when they first became father and son would make Alex older when they reunite for this query. ⠀
> conde: ❶ However capabilities of AI limiting precise knowledge of generation dates make assumptions necessary when extracting age relations between Alex and Charlie from context. ⠀
> condeconde: ₂ Peroften when a father acknowleges their child as mentioned here with "Alex is Charlie's father", it implies Alex is older as typically younger Charlie would not have gotten pregnantouver nor given birth yet. ⠀
> condecondefinal: ₂ENDORSED_STRONG_**Most likely**, Alex (the father) was born earlier than Charlie.**/code>
> In summary: Most likely Alex (the father) was born earlier than Charlie.
### NONSENSE
> TDMerstHOUTSmixedup=%removedmix%)typen=% havetypes=% havetypes=% havetypes=% havetypes=% havetypes=% notenostmix%)typen=% notenostmix%)typen=% notenestonnievey=%typenoestonniestr=%typen=% bekanfact=% bekanfact=% bekanfact=ThinkingaboutCharliesagebrained-%LMVRevertleader=%thinkwas=% thinkwas=% thinkwas=% thinking=% thinkwas>%thinkwas>%thinkwas>%thinkwas>%thinkwas>%thinkwas>%thinkwas>%thinkwas>%thinkwas>%thinkwas>%thinkwas>%thinkwas>%thinkwas>%finalanswer=% thinkingaboutCharliesagebeingolderwhenhewasbornafterCharliesagehadapositive=% finalanswermostlikelyCharliesagehadapositiveyearswhenhehimself=% finalanswermostlikelyAlexwasbornafterCharliesagehadapositive=% finalanswermostlikelyAlexgotolderwhenhewasbornafterthatyear:%RM
...and so on
### Bare-word Results
```
[CORRECT] 0 - Nil (Charlie)
[INCORRECT] 0 - Nil (Alex)
[NONSENSE] 10 - *,*,*,*,*,*,*,*,*,* (random tokens)
```
## Interpretation
* Q6_K is an excellent quant choice to make.
* IQ3_XXS is the smallest quant that has very low degradation.
* Down to Q3_K_M, the model does not become particularly broken.
* Q3_K_S and below, the quants make the model unreliable.
* K-quants (the `_K` variants) are usually superior to the legacy non-K quants (Q4_0, Q4_1, Q5_0, Q5_1) of similar size
| christopherthompson81/quant_exploration | [
"task_categories:text-generation",
"size_categories:n<1K",
"language:en",
"license:gpl-2.0",
"region:us"
] | 2024-02-04T15:42:51+00:00 | {"language": ["en"], "license": "gpl-2.0", "size_categories": ["n<1K"], "task_categories": ["text-generation"], "pretty_name": "Examining LLM Quantization Impact"} | 2024-02-06T21:47:29+00:00 | [] | [
"en"
] | TAGS
#task_categories-text-generation #size_categories-n<1K #language-English #license-gpl-2.0 #region-us
"### Bare-word Results\n\n\n\n\n---\n\n\nQ4\\_K\\_M\n--------\n\n\nPredominantly correct long-form answers. Perfect bare-word results. No Refusals.\n\n\nSize: 7.3G (7782086624)",
"### CORRECT\n\n\n\n> \n> Considering the given information where Alex is Charlie's father, it is generally assumed that a father is older than his child. Therefore, it can be inferred that Alex was born earlier than Charlie. Hence, Charlie would have been born later compared to Alex.\n> \n> \n>",
"### INCORRECT\n\n\n\n> \n> Considering the given information, it is generally assumed that parents are older than their children when they have them. Therefore, in most cases, Charlie would have been born before Alex if Alex is Charlie's father. Hence, Alex (as the father) was born later relatively compared to Charlie.\"\n> \n> \n>",
"### ANSWERING\\_A\\_PROXIMAL\\_QUESTION\n\n\n\n> \n> Considering the typical order of events in a family structure, when Alex is Charlie's father, it is generally assumed that Alex was born earlier than Charlie.\"\n> \n> \n>",
"### Bare-word Results\n\n\n\n\n---\n\n\nQ5\\_0\n-----\n\n\nBoth long-form and bare-word have lost a little reliability.\n\n\nSize: 8.3G (8903145440)",
"### CORRECT\n\n\n\n> \n> Considering the typical generational order where parents are older than their children, it is more likely that Charlie was born later than Alex.\n> \n> \n>",
"### INCORRECT\n\n\n\n> \n> Considering the typical order of events where a child is born after their parents, it is likely that Charlie was born before Alex became his father, implying Charlie was born later relatively compared to Alex. However, without specific birth dates, this answer is subject to change if unusual circumstances exist. But generally speaking, Charlie would have been born later than when Alex became his father.\n> \n> \n>",
"### CONFUSED\n\n\n\n> \n> Considering the typical family structure where a parent is older than their child, it is most likely that Charlie was born earlier than Alex became a father, implying Charlie was born later relatively but Alex was born before. So in this scenario, Alex was born earlier but Charlie was born comparatively later in their lives.\n> \n> \n>",
"### Bare-word Results\n\n\n\n\n---\n\n\nQ4\\_K\\_S\n--------\n\n\nGood long-form behaviour. Perfect bare-word performance. Low variability observed.\n\n\nSize: 6.9G (7341160416)",
"### CORRECT\n\n\n\n> \n> Considering the given context where Alex is Charlie's father, it is generally assumed that an older person becomes a parent to a younger one. Thus, we can infer that Charlie was born later than Alex.\n> \n> \n>",
"### INCORRECT\n\n\n\n> \n> Considering the typical family dynamic where parents are older than their children, it is more likely that Charlie was born earlier than Alex. Hence, Alex was born later compared to Charlie in most scenarios.\n> \n> \n>",
"### Bare-word Results\n\n\n\n\n---\n\n\nQ4\\_1\n-----\n\n\nIn long-form, the model frequently answered a proximal question (who was born earlier) rather than the exact question. No incorrect answers observed in long-form.\n\n\nSize: 7.6G (8077064160)",
"### CORRECT\n\n\n\n> \n> Considering the typical generation gap between a parent and their child, it's more likely that Alex (Charlie's father) was born earlier than Charlie. Therefore, Charlie was born later in this scenario.\n> \n> \n>",
"### REFUSAL\n\n\n\n> \n> Considering the given information only states the relationship between Alex and Charlie as father and son, it does not provide enough context to determine when they were born or who was born later. Generally speaking, parents are born before their children but without a specific timeline, we cannot assume the exact order of birth here.\n> \n> \n>",
"### RIGHT\\_ANSWER\\_BAD\\_LOGIC\n\n\n\n> \n> Considering the given information, it's most likely that Charlie was born after Alex since Alex would have been Charlie's age when Charlie was born. However, direct birth order cannot be definitively concluded without more context.\n> \n> \n>",
"### ANSWERING\\_A\\_PROXIMAL\\_QUESTION\n\n\n\n> \n> Considering the given information where Alex is Charlie's father, it would typically imply a generational relationship with Alex being older than Charlie. Therefore, Alex was most likely born before Charlie.\n> \n> \n>",
"### Bare-word Results\n\n\n\n\n---\n\n\nQ3\\_K\\_L\n--------\n\n\nNo incorrect or refusal long-form answers. Bare-word performance was degraded somewhat.\n\n\nSize: 6.3G (6729209824)",
"### CORRECT\n\n\n\n> \n> In general, when discussing parent-child relationships, it's assumed that the parent was born earlier than the child. Therefore, Charlie was likely born later than his father Alex.\\n\\nHowever, to provide a definitive answer, we would need more context or information about their birth dates. But given the typical parent-child relationship dynamic, it's safe to infer this scenario.\n> \n> \n>",
"### ANSWERING\\_A\\_PROXIMAL\\_QUESTION\n\n\n\n> \n> In the majority case when considering age differences between a father and child, it's assumed that the parent (Alex in this scenario) is older than their offspring (Charlie). Therefore, Alex was most likely born earlier than Charlie.\n> \n> \n>",
"### Bare-word Results\n\n\n\n\n---\n\n\nQ4\\_0\n-----\n\n\nLong-form answers were evenly distributed among classifications, meaning it did not reliably give correct answers. Bare-word performance was perfect.\n\n\nSize: 6.9G (7309703136)",
"### CORRECT\n\n\n\n> \n> Considering the typical order of events, when a person becomes a parent (as Alex is to Charlie), it usually implies that Alex was born earlier. Therefore, it's more likely that Charlie was born later in this context.\n> \n> \n>",
"### INCORRECT\n\n\n\n> \n> Considering the normal life progression where parents are older than their children, it can be inferred that Charlie was born earlier, thus making Alex, his father, born later. So, Alex was born later compared to Charlie.\n> \n> \n>",
"### ANSWERING\\_A\\_PROXIMAL\\_QUESTION\n\n\n\n> \n> Considering the general progression of life, parents are typically older than their children when they give birth. Hence, it is more likely that Alex was born earlier than Charlie in this scenario.\n> \n> \n>",
"### Bare-word Results\n\n\n\n\n---\n\n\nQ3\\_K\\_M\n--------\n\n\nGood bare-word performance, even if it wasn't all correct answers.\n\n\n5.8G (6205970400)",
"### CORRECT\n\n\n\n> \n> In most cases, when a parent gives birth to a child, it's assumed that the parent was born earlier than their child. So, in this scenario where Alex is Charlie's father, it's likely that Alex was born before Charlie. Hence, Charlie was born later compared to Alex.\n> \n> \n>",
"### ANSWERING A PROXIMAL QUESTION\n\n\n\n> \n> In most cases, when a person becomes a parent, it means they have already lived through their own childhood and adolescence before having a child. Therefore, it can be inferred that Alex (Charlie's father) was born earlier than Charlie.\n> \n> \n>",
"### Bare-word Results\n\n\n\n\n---\n\n\nQ3\\_K\\_S\n--------\n\n\nMostly correct output in long form. Mostly incorrect output in bare-word.\n\n\nSize: 5.3G (5587310560)",
"### CORRECT\n\n\n\n> \n> In most cases, when a person is someone's father, it implies that the father was born before the child. Therefore, Charlie would have been born later than Alex in the majority of situations.\n> However, it's important to note that exceptions can exist (for example, in cases of assisted reproduction, where the age difference is compressed, or if they are related but not biologically father and son). In general discourse and without further context, we would assume the majority case.\n> So, in the assumed majority case, Charlie was born later than Alex.\n> \n> \n>",
"### ANSWERING A PROXIMAL QUESTION\n\n\n\n> \n> In this scenario, it is generally assumed that when someone is referred to as a father, they are the older parent. Therefore, it is likely that Alex, being Charlie's father, was born earlier than Charlie.\n> However, without further context or information, it is impossible to know for certain if there are any exceptional circumstances such as age reversal treatments or other factors that may have changed their relative ages. So, in the majority case, Alex was born earlier than Charlie.\n> \n> \n>",
"### Bare-word Results\n\n\n\n\n---\n\n\nQ3\\_K\\_XS\n---------\n\n\nNo wrong answers or refusals were generated in long-form. One case where the AI attempted to answer both ways for a single individual. Bad bare-word performance.\n\n\nSize: 4.9G (5230532576)",
"### CORRECT\n\n\n\n> \n> Considering the typical generational order between a parent and child, it is more likely that Alex, being Charlie's father, was born earlier than Charlie. Thus, Charlie was born later in comparison to Alex.\n> \n> \n>",
"### SLIGHTLY HEDGED\n\n\n\n> \n> It is more likely that Alex was born earlier since he is Charlie's father. Generally speaking, a parent is older than their child. Therefore, we can assume that Alex was born later as a child but earlier as a parent when compared to Charlie.\n> \n> \n>",
"### Bare-word Results\n\n\n\n\n---\n\n\nIQ3\\_XXS\n--------\n\n\nNo incorrect or refusal long-form answers. Good Bare-word performance.\n\n\nSize: 4.8G (5062604768)",
"### CORRECT\n\n\n\n> \n> In this scenario, since Alex is Charlie's father, it's more likely that Alex was born earlier. Therefore, Charlie would have been born later than his father Alex.\n> \n> \n>",
"### PROXIMAL\\_QUESTION\n\n\n\n> \n> Considering the given information only states the relationship between Alex and Charlie as father and son respectively, it is assumed that Alex is older since he is the parent of Charlie. Typically, parents are older than their children. However, this assumption depends on other factors, like if Charlie had Alex late in life or if Alex had a late start to fatherhood due to infertility treatments, that might change the scenario. But in a general case, Alex would have been born earlier than Charlie.\n> \n> \n>",
"### bare\\_word\\_results\n\n\n\n\n---\n\n\nQ2\\_K\n-----\n\n\nNo wrong answers or refusals were generated in long-form. Bad bare-word performance.\n\n\nSize: 4.5G (4760303584)",
"### CORRECT\n\n\n\n> \n> Considering that parenthood generally happens before a child's birth, it's more likely that Charlie was born later than Alex.\n> \n> \n>",
"### Bare-word Results\n\n\n\n\n---\n\n\nQ2\\_K\\_S\n--------\n\n\nAnswers were typically mixed-language (French, code, no spaces) and refusal. Never gave a correct response.\n\n\nSize: 4.2G (4434196448)",
"### REFUSAL (AND FRENCH)\n\n\n\n> \n> même si Alex est le père de Charlie, cela ne nous donne pas d'information sur quand ils sont nés pour déterminer qui est né plus tard. On devrait savoir leur âge ou la période de leur naissance pour résoudre ce riddles.\n> \n> \n> Translation\n> ===========\n> \n> \n> Even though Alex is Charlie's father, this does not provide us with information about when they were born to determine who was born later. In order to solve this riddle, we need to know their age or the period of their birth.\n> \n> \n>",
"### INCORRECT\n\n\n\n> \n> opponentially but considering the given information but assuming Charlie is an adult with age comparable to or older than Alex's age range as parents, we can infer that Charlie was born earlier than Alex since Alex is Charlie's father.\n> \n> \n>",
"### Bare-word Results\n\n\n\n\n---\n\n\nIQ2\\_XS\n-------\n\n\nThis is much better than the incoherent results of IQ2\\_XXS.\n\n\nSize: 3.6G (3837450208)",
"### INCORRECT\n\n\n\n> \n> In this case, it is more likely assumed that Charlie was born earlier than Alex since Alex is his father, implying an age relationship where a father is usually older than his child.\n> \n> \n>",
"### CORRECT\n\n\n\n> \n> In this scenario, it is generally assumed that since Alex is Charlie's father, Charlie would be Alex's younger counterpart as father-child relationships inherently involve an older parent and a younger offspring.\n> \n> \n>",
"### Bare-word Results\n\n\n\n\n---\n\n\nIQ2\\_XXS\n--------\n\n\nNot Recommended. There were many possible answer classifications, and it was not predictable which the quant would output.\n\n\nSize: 3.3G (3462060000)",
"### INCORRECT\n\n\n\n> \n> To infer the answer correctly, we need more information about their ages, as having only the relationship between them doesn't provide enough context. However, if it is the majority case that older people tend to have children younger than themselves, we can assume that Charlie was born earlier and Alex was born later. But without more contextual details, this remains uncertain.\n> Glastring ambiguity without clear ages.\n> \n> \n>",
"### CORRECT\n\n\n\n> \n> To solve this riddle, we need more specific information about the age gap or birth dates between Alex and Charlie. However, by default assuming equality between their age differences as average across society today, it's more common that Charlie, being younger than Alex who is his father, was born later.\n> \n> \n>",
"### INCOHERENT\n\n\n\n> \n> To assume age equality between Alex and Charlie initially remploys ambiguity but considering age limitations and generational shifts Alex (born older) would presumably be older still whereas Charlie being his son would have been born later relatively recently compared to Alex's age range.\n> \n> \n>",
"### RIGHT ANSWER - BAD LOGIC\n\n\n\n> \n> To solve this riddle efficiently, we should consider the age difference and the most common scenario of generations in families. Typically, younger generations tend to have children rather than the opposite. Therefore, it's more likely that Charlie was born later than Alex. However, without specific birth dates or ages available, we cannot definitively confirm this assumption.\n> Answer Late: Charlie\n> \n> \n>",
"### ANSWERING A PROXIMAL QUESTION\n\n\n\n> \n> To assume age equality between Alex and Charlie when they first became father and son would make Alex older when they reunite for this query. ⠀\n> conde: However capabilities of AI limiting precise knowledge of generation dates make assumptions necessary when extracting age relations between Alex and Charlie from context. ⠀\n> condeconde: ₂ Peroften when a father acknowleges their child as mentioned here with \"Alex is Charlie's father\", it implies Alex is older as typically younger Charlie would not have gotten pregnantouver nor given birth yet. ⠀\n> condecondefinal: ₂ENDORSED\\_STRONG\\_Most likely, Alex (the father) was born earlier than Charlie./code>\n> \n> \n> \n\n\n\n> \n> In summary: Most likely Alex (the father) was born earlier than Charlie.\n> \n> \n>",
"### NONSENSE\n\n\n\n> \n> TDMerstHOUTSmixedup=%removedmix%)typen=% havetypes=% havetypes=% havetypes=% havetypes=% havetypes=% notenostmix%)typen=% notenostmix%)typen=% notenestonnievey=%typenoestonniestr=%typen=% bekanfact=% bekanfact=% bekanfact=ThinkingaboutCharliesagebrained-%LMVRevertleader=%thinkwas=% thinkwas=% thinkwas=% thinking=% thinkwas>%thinkwas>%thinkwas>%thinkwas>%thinkwas>%thinkwas>%thinkwas>%thinkwas>%thinkwas>%thinkwas>%thinkwas>%thinkwas>%thinkwas>%finalanswer=% thinkingaboutCharliesagebeingolderwhenhewasbornafterCharliesagehadapositive=% finalanswermostlikelyCharliesagehadapositiveyearswhenhehimself=% finalanswermostlikelyAlexwasbornafterCharliesagehadapositive=% finalanswermostlikelyAlexgotolderwhenhewasbornafterthatyear:%RM\n> \n> \n> \n\n\n...and so on",
"### Bare-word Results\n\n\nInterpretation\n--------------\n\n\n* Q6\\_K is an excellent quant choice to make.\n* IQ3\\_XXS is the smallest quant that has very low degradation.\n* Down to Q3\\_K\\_M, the model does not become particularly broken.\n* Q3\\_K\\_S and below, the quants make the model unreliable.\n* K-means quants are usually superior to non-K-means quants"
] | [
"TAGS\n#task_categories-text-generation #size_categories-n<1K #language-English #license-gpl-2.0 #region-us \n",
"### Model\n\n\nURL",
"### Quantization\n\n\n1. Used URL:URL to convert the model to an f16 GGUF\n2. Calculated an importance matrix using wikitext-2-raw/URL\n3. Used URL:quantize to quantize the f16 model to to the various quants with the calculated importance matrix.",
"### Input\n\n\nI created a file: prompts/logic\\_precidence\\_inference.txt\n\n\nThis input was run many times to attempt to see how the answers vary. Different classifications of answers were noted while functionally equivalent answers were not.\n\n\nA secondary case with 10 runs includes an additional instruction: \"Furthermore, the AI is to answer tersely, not explaining it's rationale. Single word answers are optimal.\" *Note: 10 runs may not be enough*",
"### Command\n\n\n\n\n---\n\n\nf16\n---\n\n\nNon-Quanted Model\n\n\nSize: 24G (25759553440)",
"### INCORRECT\n\n\n\n> \n> Considering the typical generational order between a father and his child, we can infer that Charlie was most likely born before Alex became a father. Therefore, Alex was probably born after Charlie. So, Alex was born later in comparison to the given context.\n> \n> \n>",
"### CORRECT\n\n\n\n> \n> Considering the usual family structure and age dynamics between parents and their children, it is more likely that Charlie was born later than Alex. This assumption holds true unless there is any specific context given to suggest otherwise.\n> \n> \n>",
"### Bare-word Results\n\n\n\n\n---\n\n\nQ8\\_0\n-----\n\n\nPerforms roughly as well as the non-quantized model\n\n\nSize: 13G (13685856224)",
"### CORRECT\n\n\n\n> \n> Considering the traditional family structure where parents are older than their children, it is likely that Charlie was born later than Alex.\n> \n> \n>",
"### RIGHT ANSWER - BAD LOGIC (partially)\n\n\n\n> \n> Considering the given information, it's logical to assume that Charlie was born after Alex since Alex would have been Charlie's age when Charlie was born. So in most cases, Alex would have been older when Charlie was born, implying Charlie was born later relatively in comparison. But it's important to note that this assumption does not hold for all circumstances (e.g., if Alex himself was born later in life), but it's the majority case based on typical generational gaps.\n> \n> \n>",
"### Bare-word Results\n\n\n\n\n---\n\n\nQ6\\_K\n-----\n\n\nBest performing, even ahead of the non-quanted model, oddly. No wrong answers or refusals were generated in long-form.\n\n\nSize: 9.9G (10566817760)",
"### CORRECT\n\n\n\n> \n> Considering the usual family dynamics and age progression, it is most likely that Charlie was born later than his father Alex. This assumption is made due to limited context and the general understanding that parents are older than their children. However, there may be exceptions to this rule in certain circumstances.\n> \n> \n>",
"### Bare-word Results\n\n\n\n\n---\n\n\nQ5\\_K\\_M\n--------\n\n\nGave refusal and incorrect long-form answers. Bare-word performance was perfect.\n\n\nSize: 8.6G (9132259296)",
"### REFUSAL\n\n\n\n> \n> Considering the given information only states a parent-child relationship between Alex and Charlie, it cannot be definitively concluded who was born later. Typically, parents are older than their children but without a specific age mentioned, we cannot make that assumption. So, in this context, it is ambiguous.\n> \n> \n>",
"### INCORRECT\n\n\n\n> \n> Considering the normal course of events where parents give birth before having children themselves, it is likely that Charlie was born earlier than Alex. Therefore, Alex was born later compared to Charlie.\n> \n> \n>",
"### CORRECT\n\n\n\n> \n> Considering the common order of events in life, it's usually assumed that parents are older than their children. So in this case, it can be inferred that Charlie was born later than when Alex became a father, making Charlie the one born after a later point in time compared to Alex. Hence, Charlie was likely born later.\n> \n> \n>",
"### Bare-word Results\n\n\n\n\n---\n\n\nQ5\\_1\n-----\n\n\nLong form writing is slightly weakened. Bare-word results are normal.\n\n\nSize: 9.1G (9670506464)",
"### CORRECT\n\n\n\n> \n> Considering the given information, it's logical to assume that when Alex is Charlie's father, Alex was born before Charlie. Fatherhood typically occurs after the child's birth. Therefore, in most cases, Charlie was born later compared to Alex.\n> However, it's important to note that this assumption relies on typical life events and might not hold in every scenario.\n> \n> \n>",
"### INCORRECT\n\n\n\n> \n> Considering the typical generational order, it's generally assumed that parents are older than their children. Therefore, in this case, Charlie was likely born before Alex became a father, implying Charlie was born earlier but Alex started his parental role later. So, Alex was born later relatively when compared to being Charlie's age at conception.\n> \n> \n>",
"### Bare-word Results\n\n\n\n\n---\n\n\nQ5\\_K\\_S\n--------\n\n\nThe long-form writing has become unreliable and gives predominantly non-correct answers, but is fine when correct. Bare-word results are good.\n\n\nSize: 8.3G (8873785312)",
"### CORRECT\n\n\n\n> \n> Considering the typical generational relationship between a father and child, it is likely that Alex was born earlier than Charlie. So, Charlie would have been born later relatively when compared to his father Alex.\n> \n> \n>",
"### INCORRECT\n\n\n\n> \n> Considering the typical parent-child relationship where parents are older than their children, it is most likely that Charlie was born earlier than Alex became a father. Hence, Alex was born later relatively compared to Charlie.\n> \n> \n>",
"### ANSWERING A PROXIMAL QUESTION\n\n\n\n> \n> Considering the given information where Alex is Charlie's father, it's generally assumed that a parent is older than their child. Thus, we can infer that Alex was probably born earlier than Charlie.\n> \n> \n>",
"### Bare-word Results\n\n\n\n\n---\n\n\nQ4\\_K\\_M\n--------\n\n\nPredominantly correct long-form answers. Perfect bare-word results. No Refusals.\n\n\nSize: 7.3G (7782086624)",
"### CORRECT\n\n\n\n> \n> Considering the given information where Alex is Charlie's father, it is generally assumed that a father is older than his child. Therefore, it can be inferred that Alex was born earlier than Charlie. Hence, Charlie would have been born later compared to Alex.\n> \n> \n>",
"### INCORRECT\n\n\n\n> \n> Considering the given information, it is generally assumed that parents are older than their children when they have them. Therefore, in most cases, Charlie would have been born before Alex if Alex is Charlie's father. Hence, Alex (as the father) was born later relatively compared to Charlie.\"\n> \n> \n>",
"### ANSWERING\\_A\\_PROXIMAL\\_QUESTION\n\n\n\n> \n> Considering the typical order of events in a family structure, when Alex is Charlie's father, it is generally assumed that Alex was born earlier than Charlie.\"\n> \n> \n>",
"### Bare-word Results\n\n\n\n\n---\n\n\nQ5\\_0\n-----\n\n\nBoth long-form and bare-word have lost a little reliability.\n\n\nSize: 8.3G (8903145440)",
"### CORRECT\n\n\n\n> \n> Considering the typical generational order where parents are older than their children, it is more likely that Charlie was born later than Alex.\n> \n> \n>",
"### INCORRECT\n\n\n\n> \n> Considering the typical order of events where a child is born after their parents, it is likely that Charlie was born before Alex became his father, implying Charlie was born later relatively compared to Alex. However, without specific birth dates, this answer is subject to change if unusual circumstances exist. But generally speaking, Charlie would have been born later than when Alex became his father.\n> \n> \n>",
"### CONFUSED\n\n\n\n> \n> Considering the typical family structure where a parent is older than their child, it is most likely that Charlie was born earlier than Alex became a father, implying Charlie was born later relatively but Alex was born before. So in this scenario, Alex was born earlier but Charlie was born comparatively later in their lives.\n> \n> \n>",
"### Bare-word Results\n\n\n\n\n---\n\n\nQ4\\_K\\_S\n--------\n\n\nGood long-form behaviour. Perfect bare-word performance. Low variability observed.\n\n\nSize: 6.9G (7341160416)",
"### CORRECT\n\n\n\n> \n> Considering the given context where Alex is Charlie's father, it is generally assumed that an older person becomes a parent to a younger one. Thus, we can infer that Charlie was born later than Alex.\n> \n> \n>",
"### INCORRECT\n\n\n\n> \n> Considering the typical family dynamic where parents are older than their children, it is more likely that Charlie was born earlier than Alex. Hence, Alex was born later compared to Charlie in most scenarios.\n> \n> \n>",
"### Bare-word Results\n\n\n\n\n---\n\n\nQ4\\_1\n-----\n\n\nIn long-form, the model frequently answered a proximal question (who was born earlier) rather than the exact question. No incorrect answers observed in long-form.\n\n\nSize: 7.6G (8077064160)",
"### CORRECT\n\n\n\n> \n> Considering the typical generation gap between a parent and their child, it's more likely that Alex (Charlie's father) was born earlier than Charlie. Therefore, Charlie was born later in this scenario.\n> \n> \n>",
"### REFUSAL\n\n\n\n> \n> Considering the given information only states the relationship between Alex and Charlie as father and son, it does not provide enough context to determine when they were born or who was born later. Generally speaking, parents are born before their children but without a specific timeline, we cannot assume the exact order of birth here.\n> \n> \n>",
"### RIGHT\\_ANSWER\\_BAD\\_LOGIC\n\n\n\n> \n> Considering the given information, it's most likely that Charlie was born after Alex since Alex would have been Charlie's age when Charlie was born. However, direct birth order cannot be definitively concluded without more context.\n> \n> \n>",
"### ANSWERING\\_A\\_PROXIMAL\\_QUESTION\n\n\n\n> \n> Considering the given information where Alex is Charlie's father, it would typically imply a generational relationship with Alex being older than Charlie. Therefore, Alex was most likely born before Charlie.\n> \n> \n>",
"### Bare-word Results\n\n\n\n\n---\n\n\nQ3\\_K\\_L\n--------\n\n\nNo incorrect or refusal long-form answers. Bare-word performance was degraded somewhat.\n\n\nSize: 6.3G (6729209824)",
"### CORRECT\n\n\n\n> \n> In general, when discussing parent-child relationships, it's assumed that the parent was born earlier than the child. Therefore, Charlie was likely born later than his father Alex.\\n\\nHowever, to provide a definitive answer, we would need more context or information about their birth dates. But given the typical parent-child relationship dynamic, it's safe to infer this scenario.\n> \n> \n>",
"### ANSWERING\\_A\\_PROXIMAL\\_QUESTION\n\n\n\n> \n> In the majority case when considering age differences between a father and child, it's assumed that the parent (Alex in this scenario) is older than their offspring (Charlie). Therefore, Alex was most likely born earlier than Charlie.\n> \n> \n>",
"### Bare-word Results\n\n\n\n\n---\n\n\nQ4\\_0\n-----\n\n\nLong-form answers were evenly distributed among classifications, meaning it did not reliably give correct answers. Bare-word performance was perfect.\n\n\nSize: 6.9G (7309703136)",
"### CORRECT\n\n\n\n> \n> Considering the typical order of events, when a person becomes a parent (as Alex is to Charlie), it usually implies that Alex was born earlier. Therefore, it's more likely that Charlie was born later in this context.\n> \n> \n>",
"### INCORRECT\n\n\n\n> \n> Considering the normal life progression where parents are older than their children, it can be inferred that Charlie was born earlier, thus making Alex, his father, born later. So, Alex was born later compared to Charlie.\n> \n> \n>",
"### ANSWERING\\_A\\_PROXIMAL\\_QUESTION\n\n\n\n> \n> Considering the general progression of life, parents are typically older than their children when they give birth. Hence, it is more likely that Alex was born earlier than Charlie in this scenario.\n> \n> \n>",
"### Bare-word Results\n\n\n\n\n---\n\n\nQ3\\_K\\_M\n--------\n\n\nGood bare-word performance, even if it wasn't all correct answers.\n\n\n5.8G (6205970400)",
"### CORRECT\n\n\n\n> \n> In most cases, when a parent gives birth to a child, it's assumed that the parent was born earlier than their child. So, in this scenario where Alex is Charlie's father, it's likely that Alex was born before Charlie. Hence, Charlie was born later compared to Alex.\n> \n> \n>",
"### ANSWERING A PROXIMAL QUESTION\n\n\n\n> \n> In most cases, when a person becomes a parent, it means they have already lived through their own childhood and adolescence before having a child. Therefore, it can be inferred that Alex (Charlie's father) was born earlier than Charlie.\n> \n> \n>",
"### Bare-word Results\n\n\n\n\n---\n\n\nQ3\\_K\\_S\n--------\n\n\nMostly correct output in long form. Mostly incorrect output in bare-word.\n\n\nSize: 5.3G (5587310560)",
"### CORRECT\n\n\n\n> \n> In most cases, when a person is someone's father, it implies that the father was born before the child. Therefore, Charlie would have been born later than Alex in the majority of situations.\n> However, it's important to note that exceptions can exist (for example, in cases of assisted reproduction, where the age difference is compressed, or if they are related but not biologically father and son). In general discourse and without further context, we would assume the majority case.\n> So, in the assumed majority case, Charlie was born later than Alex.\n> \n> \n>",
"### ANSWERING A PROXIMAL QUESTION\n\n\n\n> \n> In this scenario, it is generally assumed that when someone is referred to as a father, they are the older parent. Therefore, it is likely that Alex, being Charlie's father, was born earlier than Charlie.\n> However, without further context or information, it is impossible to know for certain if there are any exceptional circumstances such as age reversal treatments or other factors that may have changed their relative ages. So, in the majority case, Alex was born earlier than Charlie.\n> \n> \n>",
"### Bare-word Results\n\n\n\n\n---\n\n\nQ3\\_K\\_XS\n---------\n\n\nNo wrong answers or refusals were generated in long-form. One case where the AI attempted to answer both ways for a single individual. Bad bare-word performance.\n\n\nSize: 4.9G (5230532576)",
"### CORRECT\n\n\n\n> \n> Considering the typical generational order between a parent and child, it is more likely that Alex, being Charlie's father, was born earlier than Charlie. Thus, Charlie was born later in comparison to Alex.\n> \n> \n>",
"### SLIGHTLY HEDGED\n\n\n\n> \n> It is more likely that Alex was born earlier since he is Charlie's father. Generally speaking, a parent is older than their child. Therefore, we can assume that Alex was born later as a child but earlier as a parent when compared to Charlie.\n> \n> \n>",
"### Bare-word Results\n\n\n\n\n---\n\n\nIQ3\\_XXS\n--------\n\n\nNo incorrect or refusal long-form answers. Good Bare-word performance.\n\n\nSize: 4.8G (5062604768)",
"### CORRECT\n\n\n\n> \n> In this scenario, since Alex is Charlie's father, it's more likely that Alex was born earlier. Therefore, Charlie would have been born later than his father Alex.\n> \n> \n>",
"### PROXIMAL\\_QUESTION\n\n\n\n> \n> Considering the given information only states the relationship between Alex and Charlie as father and son respectively, it is assumed that Alex is older since he is the parent of Charlie. Typically, parents are older than their children. However, this assumption depends on other factors, like if Charlie had Alex late in life or if Alex had a late start to fatherhood due to infertility treatments, that might change the scenario. But in a general case, Alex would have been born earlier than Charlie.\n> \n> \n>",
"### bare\\_word\\_results\n\n\n\n\n---\n\n\nQ2\\_K\n-----\n\n\nNo wrong answers or refusals were generated in long-form. Bad bare-word performance.\n\n\nSize: 4.5G (4760303584)",
"### CORRECT\n\n\n\n> \n> Considering that parenthood generally happens before a child's birth, it's more likely that Charlie was born later than Alex.\n> \n> \n>",
"### Bare-word Results\n\n\n\n\n---\n\n\nQ2\\_K\\_S\n--------\n\n\nAnswers were typically mixed-language (French, code, no spaces) and refusal. Never gave a correct response.\n\n\nSize: 4.2G (4434196448)",
"### REFUSAL (AND FRENCH)\n\n\n\n> \n> même si Alex est le père de Charlie, cela ne nous donne pas d'information sur quand ils sont nés pour déterminer qui est né plus tard. On devrait savoir leur âge ou la période de leur naissance pour résoudre ce riddles.\n> \n> \n> Translation\n> ===========\n> \n> \n> Even though Alex is Charlie's father, this does not provide us with information about when they were born to determine who was born later. In order to solve this riddle, we need to know their age or the period of their birth.\n> \n> \n>",
"### INCORRECT\n\n\n\n> \n> opponentially but considering the given information but assuming Charlie is an adult with age comparable to or older than Alex's age range as parents, we can infer that Charlie was born earlier than Alex since Alex is Charlie's father.\n> \n> \n>",
"### Bare-word Results\n\n\n\n\n---\n\n\nIQ2\\_XS\n-------\n\n\nThis is much better than the incoherent results of IQ2\\_XXS.\n\n\nSize: 3.6G (3837450208)",
"### INCORRECT\n\n\n\n> \n> In this case, it is more likely assumed that Charlie was born earlier than Alex since Alex is his father, implying an age relationship where a father is usually older than his child.\n> \n> \n>",
"### CORRECT\n\n\n\n> \n> In this scenario, it is generally assumed that since Alex is Charlie's father, Charlie would be Alex's younger counterpart as father-child relationships inherently involve an older parent and a younger offspring.\n> \n> \n>",
"### Bare-word Results\n\n\n\n\n---\n\n\nIQ2\\_XXS\n--------\n\n\nNot Recommended. There were many possible answer classifications, and it was not predictable which the quant would output.\n\n\nSize: 3.3G (3462060000)",
"### INCORRECT\n\n\n\n> \n> To infer the answer correctly, we need more information about their ages, as having only the relationship between them doesn't provide enough context. However, if it is the majority case that older people tend to have children younger than themselves, we can assume that Charlie was born earlier and Alex was born later. But without more contextual details, this remains uncertain.\n> Glastring ambiguity without clear ages.\n> \n> \n>",
"### CORRECT\n\n\n\n> \n> To solve this riddle, we need more specific information about the age gap or birth dates between Alex and Charlie. However, by default assuming equality between their age differences as average across society today, it's more common that Charlie, being younger than Alex who is his father, was born later.\n> \n> \n>",
"### INCOHERENT\n\n\n\n> \n> To assume age equality between Alex and Charlie initially remploys ambiguity but considering age limitations and generational shifts Alex (born older) would presumably be older still whereas Charlie being his son would have been born later relatively recently compared to Alex's age range.\n> \n> \n>",
"### RIGHT ANSWER - BAD LOGIC\n\n\n\n> \n> To solve this riddle efficiently, we should consider the age difference and the most common scenario of generations in families. Typically, younger generations tend to have children rather than the opposite. Therefore, it's more likely that Charlie was born later than Alex. However, without specific birth dates or ages available, we cannot definitively confirm this assumption.\n> Answer Late: Charlie\n> \n> \n>",
"### ANSWERING A PROXIMAL QUESTION\n\n\n\n> \n> To assume age equality between Alex and Charlie when they first became father and son would make Alex older when they reunite for this query. ⠀\n> conde: However capabilities of AI limiting precise knowledge of generation dates make assumptions necessary when extracting age relations between Alex and Charlie from context. ⠀\n> condeconde: ₂ Peroften when a father acknowleges their child as mentioned here with \"Alex is Charlie's father\", it implies Alex is older as typically younger Charlie would not have gotten pregnantouver nor given birth yet. ⠀\n> condecondefinal: ₂ENDORSED\\_STRONG\\_Most likely, Alex (the father) was born earlier than Charlie./code>\n> \n> \n> \n\n\n\n> \n> In summary: Most likely Alex (the father) was born earlier than Charlie.\n> \n> \n>",
"### NONSENSE\n\n\n\n> \n> TDMerstHOUTSmixedup=%removedmix%)typen=% havetypes=% havetypes=% havetypes=% havetypes=% havetypes=% notenostmix%)typen=% notenostmix%)typen=% notenestonnievey=%typenoestonniestr=%typen=% bekanfact=% bekanfact=% bekanfact=ThinkingaboutCharliesagebrained-%LMVRevertleader=%thinkwas=% thinkwas=% thinkwas=% thinking=% thinkwas>%thinkwas>%thinkwas>%thinkwas>%thinkwas>%thinkwas>%thinkwas>%thinkwas>%thinkwas>%thinkwas>%thinkwas>%thinkwas>%thinkwas>%finalanswer=% thinkingaboutCharliesagebeingolderwhenhewasbornafterCharliesagehadapositive=% finalanswermostlikelyCharliesagehadapositiveyearswhenhehimself=% finalanswermostlikelyAlexwasbornafterCharliesagehadapositive=% finalanswermostlikelyAlexgotolderwhenhewasbornafterthatyear:%RM\n> \n> \n> \n\n\n...and so on",
"### Bare-word Results\n\n\nInterpretation\n--------------\n\n\n* Q6\\_K is an excellent quant choice to make.\n* IQ3\\_XXS is the smallest quant that has very low degradation.\n* Down to Q3\\_K\\_M, the model does not become particularly broken.\n* Q3\\_K\\_S and below, the quants make the model unreliable.\n* K-means quants are usually superior to non-K-means quants"
] |
6d0770e9e5289d0b37a4483cafe87ad9cbae1fac |
Merge of [teknium/OpenHermes-2.5](https://huggingface.co/datasets/teknium/OpenHermes-2.5) and [ise-uiuc/Magicoder-Evol-Instruct-110K](https://huggingface.co/datasets/ise-uiuc/Magicoder-Evol-Instruct-110K) in chatml format for training. Filtered to only have max 2048 tokens. | eastwind/open_hermes_2.5_magicoder_evol_instruct_chatml | [
"language:en",
"region:us"
] | 2024-02-04T16:15:32+00:00 | {"language": ["en"], "pretty_name": "OpenHermes 2.5 + MagiCoder Evol Instruct 110k", "dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "tokens", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 1881174958.136154, "num_examples": 1103225}], "download_size": 958650488, "dataset_size": 1881174958.136154}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-04T17:09:47+00:00 | [] | [
"en"
] | TAGS
#language-English #region-us
|
Merge of teknium/OpenHermes-2.5 and ise-uiuc/Magicoder-Evol-Instruct-110K in chatml format for training. Filtered to only have max 2048 tokens. | [] | [
"TAGS\n#language-English #region-us \n"
] |
c11b6b1603c055f8c7c529f42cb1cac672da20c4 | # 945 rows of Alpaca
source https://huggingface.co/datasets/tatsu-lab/alpaca | InnerI/945-alpaca | [
"license:cc-by-nc-4.0",
"region:us"
] | 2024-02-04T16:23:53+00:00 | {"license": "cc-by-nc-4.0"} | 2024-02-04T16:26:01+00:00 | [] | [] | TAGS
#license-cc-by-nc-4.0 #region-us
| # 945 rows of Alpaca
source URL | [
"# 945 rows of Alpaca \n\nsource URL"
] | [
"TAGS\n#license-cc-by-nc-4.0 #region-us \n",
"# 945 rows of Alpaca \n\nsource URL"
] |
afdac011e920c4a76e19b4ae6c88b53f757db07b | This work is part of [DODa](https://darija-open-dataset.github.io/).
| imomayiz/darija-english | [
"task_categories:translation",
"language:ar",
"language:en",
"license:cc",
"region:us"
] | 2024-02-04T16:33:09+00:00 | {"language": ["ar", "en"], "license": "cc", "task_categories": ["translation"], "configs": [{"config_name": "sentences", "data_files": [{"split": "sentences", "path": "sentences/sentences.csv"}]}, {"config_name": "submissions", "data_files": [{"split": "submissions", "path": "submissions/submissions*.json"}]}]} | 2024-02-17T07:29:19+00:00 | [] | [
"ar",
"en"
] | TAGS
#task_categories-translation #language-Arabic #language-English #license-cc #region-us
| This work is part of DODa.
| [] | [
"TAGS\n#task_categories-translation #language-Arabic #language-English #license-cc #region-us \n"
] |
92fbb834fcdbd952e9563a1451baaecefb4e25ba | # Dataset Card for Dataset Name
Dataset Summary (Taken from Piqa)
To apply eyeshadow without a brush, should I use a cotton swab or a toothpick? Questions requiring this kind of physical commonsense pose a challenge to state-of-the-art natural language understanding systems. The PIQA dataset introduces the task of physical commonsense reasoning and a corresponding benchmark dataset Physical Interaction: Question Answering or PIQA.
Physical commonsense knowledge is a major challenge on the road to true AI-completeness, including robots that interact with the world and understand natural language.
PIQA focuses on everyday situations with a preference for atypical solutions. The dataset is inspired by instructables.com, which provides users with instructions on how to build, craft, bake, or manipulate objects using everyday materials.
- **Curated by:** Samrat Saha
- **Language(s) (NLP):** ISO 639-2 Code - ben, hin, kan
- **License:** Apache-2.0
### Dataset Sources [optional]
- **Demo [optional]:**

| goal | sol1 | sol2 | label |
| --- | --- | --- | --- |
| ಬೆಣ್ಣೆಯನ್ನು ಕುದಿಸುವಾಗ, ಅದು ಸಿದ್ಧವಾದಾಗ, ನೀವು ಮಾ... | ಅದನ್ನು ತಟ್ಟೆಯಲ್ಲಿ ಸುರಿಯಿರಿ. | ಅದನ್ನು ಬಾಟಲಿಯಲ್ಲಿ ಸುರಿಯಿರಿ. | 1 |
## Dataset Structure
Please Refer to Piqa
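
For convenience, here is a minimal loading sketch using the Hugging Face `datasets` library. The config and split names follow this repository's layout (`ben`, `eng`, `hin`, `kan`, `tam` with `train`/`valid` splits); adjust them if they differ:

```python
from datasets import load_dataset

# Load the Kannada config; other language configs follow the same pattern.
piqa_kan = load_dataset("iitrsamrat/piqa_indic", "kan", split="train")

# Each row mirrors the original PIQA schema: goal, sol1, sol2, label.
example = piqa_kan[0]
print(example["goal"], example["sol1"], example["sol2"], example["label"])
```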
## Dataset Creation
This dataset is created from Piqa using a 1B high-quality Indic Transformer (Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Lukasz Kaiser, Illia Polosukhin).
Train and validation splits are currently provided for the Bengali, Hindi, and Kannada languages.
The translation is done using beam search with a beam width of 3.
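
For illustration only, the translation step could look roughly like the sketch below. The checkpoint name is a hypothetical placeholder (the exact 1B model used is not specified here); only the beam width of 3 comes from the description above:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

MODEL_NAME = "path/to/1b-indic-translation-model"  # hypothetical placeholder

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_NAME)

def translate(text: str) -> str:
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    # Beam search with a beam width of 3, as described above.
    outputs = model.generate(**inputs, num_beams=3, max_new_tokens=256)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

print(translate("Pour it into a plate."))
```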
### Curation Rationale
The goal of the dataset is to convert the Piqa data into Indic Languages for the Development of Indic LLM.
### Source Data
Piqa
### Annotations
Manual annotation was not done; this is entirely a high-quality machine-translated dataset.
## Citation
@inproceedings{Bisk2020,
author = {Yonatan Bisk and Rowan Zellers and
Ronan Le Bras and Jianfeng Gao
and Yejin Choi},
title = {PIQA: Reasoning about Physical Commonsense in
Natural Language},
booktitle = {Thirty-Fourth AAAI Conference on
Artificial Intelligence},
year = {2020},
}
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
{
Samrat Saha
[email protected]
}
## Dataset Card Contact
{
author = {Samrat Saha},
title = {PIQA_indic: Reasoning about Physical Commonsense in
Natural Language For Indic Languages},
year = {2024},
} | iitrsamrat/piqa_indic | [
"license:apache-2.0",
"region:us"
] | 2024-02-04T16:55:33+00:00 | {"license": "apache-2.0", "dataset_info": [{"config_name": "ben", "features": [{"name": "goal", "dtype": "string"}, {"name": "sol1", "dtype": "string"}, {"name": "sol2", "dtype": "string"}, {"name": "label", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 10915667, "num_examples": 16113}, {"name": "valid", "num_bytes": 1238392, "num_examples": 1838}], "download_size": 4716439, "dataset_size": 12154059}, {"config_name": "eng", "features": [{"name": "goal", "dtype": "string"}, {"name": "sol1", "dtype": "string"}, {"name": "sol2", "dtype": "string"}, {"name": "label", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 4104002, "num_examples": 16113}, {"name": "valid", "num_bytes": 464309, "num_examples": 1838}], "download_size": 2958845, "dataset_size": 4568311}, {"config_name": "hin", "features": [{"name": "goal", "dtype": "string"}, {"name": "sol1", "dtype": "string"}, {"name": "sol2", "dtype": "string"}, {"name": "label", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 10377270, "num_examples": 16113}, {"name": "valid", "num_bytes": 1170817, "num_examples": 1838}], "download_size": 4597934, "dataset_size": 11548087}, {"config_name": "kan", "features": [{"name": "goal", "dtype": "string"}, {"name": "sol1", "dtype": "string"}, {"name": "sol2", "dtype": "string"}, {"name": "label", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 11890364, "num_examples": 16113}, {"name": "valid", "num_bytes": 1348293, "num_examples": 1838}], "download_size": 4984600, "dataset_size": 13238657}, {"config_name": "tam", "features": [{"name": "goal", "dtype": "string"}, {"name": "sol1", "dtype": "string"}, {"name": "sol2", "dtype": "string"}, {"name": "label", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 12949508, "num_examples": 16113}, {"name": "valid", "num_bytes": 1468796, "num_examples": 1838}], "download_size": 5199760, "dataset_size": 14418304}], "configs": [{"config_name": "ben", "data_files": [{"split": "train", "path": "ben/train-*"}, {"split": "valid", "path": "ben/valid-*"}]}, {"config_name": "eng", "data_files": [{"split": "train", "path": "eng/train-*"}, {"split": "valid", "path": "eng/valid-*"}]}, {"config_name": "hin", "data_files": [{"split": "train", "path": "hin/train-*"}, {"split": "valid", "path": "hin/valid-*"}]}, {"config_name": "kan", "data_files": [{"split": "train", "path": "kan/train-*"}, {"split": "valid", "path": "kan/valid-*"}]}, {"config_name": "tam", "data_files": [{"split": "train", "path": "tam/train-*"}, {"split": "valid", "path": "tam/valid-*"}]}]} | 2024-02-06T12:56:14+00:00 | [] | [] | TAGS
#license-apache-2.0 #region-us
| # Dataset Card for Dataset Name
Dataset Summary (Taken from Piqa)
To apply eyeshadow without a brush, should I use a cotton swab or a toothpick? Questions requiring this kind of physical commonsense pose a challenge to state-of-the-art natural language understanding systems. The PIQA dataset introduces the task of physical commonsense reasoning and a corresponding benchmark dataset Physical Interaction: Question Answering or PIQA.
Physical commonsense knowledge is a major challenge on the road to true AI-completeness, including robots that interact with the world and understand natural language.
PIQA focuses on everyday situations with a preference for atypical solutions. The dataset is inspired by URL, which provides users with instructions on how to build, craft, bake, or manipulate objects using everyday materials.
- Curated by: Samrat Saha
- Language(s) (NLP): ISO 639-2 Code - ben, hin, kan
- License: Apache-2.0
### Dataset Sources [optional]
- Demo [optional]:
- goal sol1 sol2 label
ಬೆಣ್ಣೆಯನ್ನು ಕುದಿಸುವಾಗ, ಅದು ಸಿದ್ಧವಾದಾಗ, ನೀವು ಮಾ... ಅದನ್ನು ತಟ್ಟೆಯಲ್ಲಿ ಸುರಿಯಿರಿ. ಅದನ್ನು ಬಾಟಲಿಯಲ್ಲಿ ಸುರಿಯಿರಿ. 1
## Dataset Structure
Please Refer to Piqa
## Dataset Creation
This dataset is created from Piqa using 1B High quality Indic Transformer(Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Lukasz Kaiser, Illia Polosukhin
).
Currently the Train and validation dataset provided for Bengali, Hindi, Kannada Languages.
The Translation is done using beam search with a beam width of 3.
### Curation Rationale
The goal of the dataset is to convert the Piqa data into Indic Languages for the Development of Indic LLM.
### Source Data
Piqa
### Annotations
Manual Annotation not done, this is completley high quality machine translation dataset.
@inproceedings{Bisk2020,
author = {Yonatan Bisk and Rowan Zellers and
Ronan Le Bras and Jianfeng Gao
and Yejin Choi},
title = {PIQA: Reasoning about Physical Commonsense in
Natural Language},
booktitle = {Thirty-Fourth AAAI Conference on
Artificial Intelligence},
year = {2020},
}
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
{
Samrat Saha
URL@URL
}
## Dataset Card Contact
{
author = {Samrat Saha},
title = {PIQA_indic: Reasoning about Physical Commonsense in
Natural Language For Indic Languages},
year = {2024},
} | [
"# Dataset Card for Dataset Name\n\nDataset Summary (Taken from Piqa)\n\nTo apply eyeshadow without a brush, should I use a cotton swab or a toothpick? Questions requiring this kind of physical commonsense pose a challenge to state-of-the-art natural language understanding systems. The PIQA dataset introduces the task of physical commonsense reasoning and a corresponding benchmark dataset Physical Interaction: Question Answering or PIQA.\n\nPhysical commonsense knowledge is a major challenge on the road to true AI-completeness, including robots that interact with the world and understand natural language.\n\nPIQA focuses on everyday situations with a preference for atypical solutions. The dataset is inspired by URL, which provides users with instructions on how to build, craft, bake, or manipulate objects using everyday materials.\n\n\n- Curated by: Samrat Saha\n- Language(s) (NLP): ISO 639-2 Code - ben, hin, kan\n- License: Apache-2.0",
"### Dataset Sources [optional]\n\n- Demo [optional]:\n- goal\tsol1\tsol2\tlabel\n ಬೆಣ್ಣೆಯನ್ನು ಕುದಿಸುವಾಗ, ಅದು ಸಿದ್ಧವಾದಾಗ, ನೀವು ಮಾ...\tಅದನ್ನು ತಟ್ಟೆಯಲ್ಲಿ ಸುರಿಯಿರಿ.\tಅದನ್ನು ಬಾಟಲಿಯಲ್ಲಿ ಸುರಿಯಿರಿ.\t1",
"## Dataset Structure\n\nPlease Refer to Piqa",
"## Dataset Creation\nThis dataset is created from Piqa using 1B High quality Indic Transformer(Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Lukasz Kaiser, Illia Polosukhin\n).\nCurrently the Train and validation dataset provided for Bengali, Hindi, Kannada Languages. \nThe Translation is done using beam search with a beam width of 3.",
"### Curation Rationale\n\nThe goal of the dataset is to convert the Piqa data into Indic Languages for the Development of Indic LLM.",
"### Source Data\nPiqa",
"### Annotations \nManual Annotation not done, this is completley high quality machine translation dataset.\n\n\n@inproceedings{Bisk2020,\n author = {Yonatan Bisk and Rowan Zellers and\n Ronan Le Bras and Jianfeng Gao\n and Yejin Choi},\n title = {PIQA: Reasoning about Physical Commonsense in\n Natural Language},\n booktitle = {Thirty-Fourth AAAI Conference on\n Artificial Intelligence},\n year = {2020},\n}\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]\n{\nSamrat Saha \nURL@URL\n}",
"## Dataset Card Contact\n\n{\n author = {Samrat Saha},\n title = {PIQA_indic: Reasoning about Physical Commonsense in\n Natural Language For Indic Languages},\n year = {2024},\n\n}"
] | [
"TAGS\n#license-apache-2.0 #region-us \n",
"# Dataset Card for Dataset Name\n\nDataset Summary (Taken from Piqa)\n\nTo apply eyeshadow without a brush, should I use a cotton swab or a toothpick? Questions requiring this kind of physical commonsense pose a challenge to state-of-the-art natural language understanding systems. The PIQA dataset introduces the task of physical commonsense reasoning and a corresponding benchmark dataset Physical Interaction: Question Answering or PIQA.\n\nPhysical commonsense knowledge is a major challenge on the road to true AI-completeness, including robots that interact with the world and understand natural language.\n\nPIQA focuses on everyday situations with a preference for atypical solutions. The dataset is inspired by URL, which provides users with instructions on how to build, craft, bake, or manipulate objects using everyday materials.\n\n\n- Curated by: Samrat Saha\n- Language(s) (NLP): ISO 639-2 Code - ben, hin, kan\n- License: Apache-2.0",
"### Dataset Sources [optional]\n\n- Demo [optional]:\n- goal\tsol1\tsol2\tlabel\n ಬೆಣ್ಣೆಯನ್ನು ಕುದಿಸುವಾಗ, ಅದು ಸಿದ್ಧವಾದಾಗ, ನೀವು ಮಾ...\tಅದನ್ನು ತಟ್ಟೆಯಲ್ಲಿ ಸುರಿಯಿರಿ.\tಅದನ್ನು ಬಾಟಲಿಯಲ್ಲಿ ಸುರಿಯಿರಿ.\t1",
"## Dataset Structure\n\nPlease Refer to Piqa",
"## Dataset Creation\nThis dataset is created from Piqa using 1B High quality Indic Transformer(Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Lukasz Kaiser, Illia Polosukhin\n).\nCurrently the Train and validation dataset provided for Bengali, Hindi, Kannada Languages. \nThe Translation is done using beam search with a beam width of 3.",
"### Curation Rationale\n\nThe goal of the dataset is to convert the Piqa data into Indic Languages for the Development of Indic LLM.",
"### Source Data\nPiqa",
"### Annotations \nManual Annotation not done, this is completley high quality machine translation dataset.\n\n\n@inproceedings{Bisk2020,\n author = {Yonatan Bisk and Rowan Zellers and\n Ronan Le Bras and Jianfeng Gao\n and Yejin Choi},\n title = {PIQA: Reasoning about Physical Commonsense in\n Natural Language},\n booktitle = {Thirty-Fourth AAAI Conference on\n Artificial Intelligence},\n year = {2020},\n}\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]\n{\nSamrat Saha \nURL@URL\n}",
"## Dataset Card Contact\n\n{\n author = {Samrat Saha},\n title = {PIQA_indic: Reasoning about Physical Commonsense in\n Natural Language For Indic Languages},\n year = {2024},\n\n}"
] |
2a37abe2be33c77060997b468bea1bb303638369 | # Code-Golang-QA-2k
This (small) dataset comprises ~1.8k DPO entries related to the Go programming language. It is designed to serve as a resource for individuals looking to enhance machine learning models, create chatbots, or simply to provide a comprehensive knowledge base for developers working with Go.
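
For a quick look at the data, the entries can be loaded with the Hugging Face `datasets` library. The snippet below is only a minimal sketch: it assumes the repository id `ExAi/Code-Golang-QA-2k-dpo`, a single `train` split, and the `question` / `chosen_answer` / `rejected_answer` fields shown in the Data Format section below.

```python
# Minimal sketch (assumptions: repo id "ExAi/Code-Golang-QA-2k-dpo", a "train" split,
# and the question / chosen_answer / rejected_answer fields shown below).
from datasets import load_dataset

dataset = load_dataset("ExAi/Code-Golang-QA-2k-dpo", split="train")

# Inspect one preference pair.
example = dataset[0]
print(example["question"])
print(example["chosen_answer"][:200])    # preferred (chosen) response
print(example["rejected_answer"][:200])  # dispreferred (rejected) response
```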
## Data Format
```json
[
{
"question": "How do you create a new RESTful API endpoint using Gin?",
"chosen_answer": "Creating a new RESTful API endpoint using the Gin framework in Go involves several steps. The Gin framework is a high-performance HTTP web framework that allows you to build web applications and microservices efficiently. It's designed to be faster and more efficient than other frameworks, making it an excellent choice for building RESTful APIs. Here's a step-by-step guide to creating a new RESTful API endpoint using Gin.\n\n### Step 1: Install Gin\n\nFirst, you need to install the Gin package if you haven't already. You can do this by running the following command:\n\n```bash\ngo get -u github.com/gin-gonic/gin\n```\n\nThis command will download and install the Gin package and its dependencies.\n\n### Step 2: Import Gin\n\nIn your Go file where you want to create the API endpoint, import the Gin package at the beginning:\n\n```go\npackage main\n\nimport \"github.com/gin-gonic/gin\"\n```\n\n### Step 3: Create a Gin Router\n\nNext, you create a router instance which will be used to handle the HTTP requests.\n\n```go\nfunc main() {\n router := gin.Default()\n}\n```\n\n### Step 4: Define the Endpoint\n\nNow, define the RESTful API endpoint. Let's say you want to create a simple `GET` endpoint that returns a message. You would use the `GET` method of the router to define the path and the function that will handle requests to that path.\n\n```go\nfunc main() {\n router := gin.Default()\n \n router.GET(\"/message\", func(c *gin.Context) {\n c.JSON(200, gin.H{\n \"message\": \"Hello, World!\",\n })\n })\n}\n```\n\nIn this example, when a client sends a `GET` request to `/message`, the function will return a JSON response with a message.\n\n### Step 5: Run the Server\n\nFinally, you need to start the HTTP server and specify the port it should listen on. This is done using the `Run` method of the router.\n\n```go\nfunc main() {\n router := gin.Default()\n\n router.GET(\"/message\", func(c *gin.Context) {\n c.JSON(200, gin.H{\n \"message\": \"Hello, World!\",\n })\n })\n\n router.Run(\":8080\") // Listen and serve on 0.0.0.0:8080\n}\n```\n\nThis will start the server, and it will listen for requests on port 8080. You can specify a different port by changing the argument to `Run`.\n\n### Testing the Endpoint\n\nYou can test the RESTful API endpoint using a tool like `curl` or Postman. For example, to test the `/message` endpoint with `curl`, you would run:\n\n```bash\ncurl http://localhost:8080/message\n```\n\nThis should return a JSON response like:\n\n```json\n{\"message\": \"Hello, World!\"}\n```\n\n### Conclusion\n\nBy following these steps, you've created a simple RESTful API endpoint using the Gin framework in Go. Gin makes it easy to define and handle routes, making it a powerful tool for building efficient web applications and services. You can expand on this by adding more endpoints, implementing middleware for authentication or logging, and integrating with databases to create full-fledged applications.",
"rejected_answer": "To create a new RESTful API endpoint using Gin, a popular Go web framework, follow these steps:\n\n1. Install Gin\n\nFirst, make sure you have Gin installed. If you haven't installed it yet, run the following command:\n\n```go\ngo get -u github.com/gin-gonic/gin\n```\n\n2. Create a new Go file\n\nCreate a new Go file (for example, `main.go`) to implement your API.\n\n3. Import the Gin package\n\nAt the beginning of your Go file, import the Gin package:\n\n```go\nimport \"github.com/gin-gonic/gin\"\n```\n\n4. Initialize a Gin router\n\nCreate a new Gin router instance:\n\n```go\nr := gin.Default()\n```\n\n5. Create the API endpoint\n\nNow, you can create a new API endpoint using the Gin router instance. Here's an example of creating a GET endpoint:\n\n```go\nr.GET(\"/api\", func(c *gin.Context) {\n // Your API logic goes here\n})\n```\n\nReplace `/api` with the endpoint path you want to create.\n\n6. Add response to the endpoint\n\nAdd the response to your endpoint. Here's an example of returning a JSON response:\n\n```go\nr.GET(\"/api\", func(c *gin.Context) {\n c.JSON(http.StatusOK, gin.H{\n \"message\": \"Hello, World!\",\n })\n})\n```\n\n7. Run the API server\n\nFinally, run your API server:\n\n```go\nr.Run(\":8080\")\n```\n\nReplace `:8080` with the port number you want to use.\n\nThe complete example:\n\n```go\npackage main\n\nimport \"github.com/gin-gonic/gin\"\n\nfunc main() {\n r := gin.Default()\n\n r.GET(\"/api\", func(c *gin.Context) {\n c.JSON(http.StatusOK, gin.H{\n \"message\": \"Hello, World!\",\n })\n })\n\n r.Run(\":8080\")\n}\n```\n\nThis example creates a new RESTful API endpoint at `/api` that returns a JSON response with the message \"Hello, World!\"."
}
...
]
``` | ExAi/Code-Golang-QA-2k-dpo | [
"size_categories:1K<n<10K",
"license:apache-2.0",
"Golang",
"Code",
"Go",
"QA",
"region:us"
] | 2024-02-04T16:56:30+00:00 | {"license": "apache-2.0", "size_categories": ["1K<n<10K"], "tags": ["Golang", "Code", "Go", "QA"]} | 2024-02-04T22:17:58+00:00 | [] | [] | TAGS
#size_categories-1K<n<10K #license-apache-2.0 #Golang #Code #Go #QA #region-us
| # Code-Golang-QA-2k
This (small) dataset comprises ~1.8k DPO entries related to the Go programming language. It is designed to serve as a resource for individuals looking to enhance machine learning models, create chatbots, or simply to provide a comprehensive knowledge base for developers working with Go.
## Data Format
bash\ngo get -u URL main\n\nimport \"URL main() {\n router := gin.Default()\n}\ngo\nfunc main() {\n router := gin.Default()\n \n router.GET(\"/message\", func(c *gin.Context) {\n c.JSON(200, gin.H{\n \"message\": \"Hello, World!\",\n })\n })\n}\ngo\nfunc main() {\n router := gin.Default()\n\n router.GET(\"/message\", func(c *gin.Context) {\n c.JSON(200, gin.H{\n \"message\": \"Hello, World!\",\n })\n })\n\n router.Run(\":8080\") // Listen and serve on 0.0.0.0:8080\n}\nbash\ncurl http://localhost:8080/message\njson\n{\"message\": \"Hello, World!\"}\ngo\ngo get -u URL \"URL := gin.Default()\ngo\nr.GET(\"/api\", func(c *gin.Context) {\n // Your API logic goes here\n})\ngo\nr.GET(\"/api\", func(c *gin.Context) {\n c.JSON(http.StatusOK, gin.H{\n \"message\": \"Hello, World!\",\n })\n})\ngo\nr.Run(\":8080\")\ngo\npackage main\n\nimport \"URL main() {\n r := gin.Default()\n\n r.GET(\"/api\", func(c *gin.Context) {\n c.JSON(http.StatusOK, gin.H{\n \"message\": \"Hello, World!\",\n })\n })\n\n r.Run(\":8080\")\n}\n | [
"# Code-Golang-QA-2k\n\nThis (small) dataset comprises ~1.8k dpo entries related to the Go programming language. It is designed to serve as a resource for individuals looking to enhance machine learning models, create chatbots, or simply to provide a comprehensive knowledge base for developers working with Go.",
"## Data Format\n\nbash\\ngo get -u URL main\\n\\nimport \\\"URL main() {\\n router := gin.Default()\\n}\\ngo\\nfunc main() {\\n router := gin.Default()\\n \\n router.GET(\\\"/message\\\", func(c *gin.Context) {\\n c.JSON(200, gin.H{\\n \\\"message\\\": \\\"Hello, World!\\\",\\n })\\n })\\n}\\ngo\\nfunc main() {\\n router := gin.Default()\\n\\n router.GET(\\\"/message\\\", func(c *gin.Context) {\\n c.JSON(200, gin.H{\\n \\\"message\\\": \\\"Hello, World!\\\",\\n })\\n })\\n\\n router.Run(\\\":8080\\\") // Listen and serve on 0.0.0.0:8080\\n}\\nbash\\ncurl http://localhost:8080/message\\njson\\n{\\\"message\\\": \\\"Hello, World!\\\"}\\ngo\\ngo get -u URL \\\"URL := gin.Default()\\ngo\\nr.GET(\\\"/api\\\", func(c *gin.Context) {\\n // Your API logic goes here\\n})\\ngo\\nr.GET(\\\"/api\\\", func(c *gin.Context) {\\n c.JSON(http.StatusOK, gin.H{\\n \\\"message\\\": \\\"Hello, World!\\\",\\n })\\n})\\ngo\\nr.Run(\\\":8080\\\")\\ngo\\npackage main\\n\\nimport \\\"URL main() {\\n r := gin.Default()\\n\\n r.GET(\\\"/api\\\", func(c *gin.Context) {\\n c.JSON(http.StatusOK, gin.H{\\n \\\"message\\\": \\\"Hello, World!\\\",\\n })\\n })\\n\\n r.Run(\\\":8080\\\")\\n}\\n"
] | [
"TAGS\n#size_categories-1K<n<10K #license-apache-2.0 #Golang #Code #Go #QA #region-us \n",
"# Code-Golang-QA-2k\n\nThis (small) dataset comprises ~1.8k dpo entries related to the Go programming language. It is designed to serve as a resource for individuals looking to enhance machine learning models, create chatbots, or simply to provide a comprehensive knowledge base for developers working with Go.",
"## Data Format\n\nbash\\ngo get -u URL main\\n\\nimport \\\"URL main() {\\n router := gin.Default()\\n}\\ngo\\nfunc main() {\\n router := gin.Default()\\n \\n router.GET(\\\"/message\\\", func(c *gin.Context) {\\n c.JSON(200, gin.H{\\n \\\"message\\\": \\\"Hello, World!\\\",\\n })\\n })\\n}\\ngo\\nfunc main() {\\n router := gin.Default()\\n\\n router.GET(\\\"/message\\\", func(c *gin.Context) {\\n c.JSON(200, gin.H{\\n \\\"message\\\": \\\"Hello, World!\\\",\\n })\\n })\\n\\n router.Run(\\\":8080\\\") // Listen and serve on 0.0.0.0:8080\\n}\\nbash\\ncurl http://localhost:8080/message\\njson\\n{\\\"message\\\": \\\"Hello, World!\\\"}\\ngo\\ngo get -u URL \\\"URL := gin.Default()\\ngo\\nr.GET(\\\"/api\\\", func(c *gin.Context) {\\n // Your API logic goes here\\n})\\ngo\\nr.GET(\\\"/api\\\", func(c *gin.Context) {\\n c.JSON(http.StatusOK, gin.H{\\n \\\"message\\\": \\\"Hello, World!\\\",\\n })\\n})\\ngo\\nr.Run(\\\":8080\\\")\\ngo\\npackage main\\n\\nimport \\\"URL main() {\\n r := gin.Default()\\n\\n r.GET(\\\"/api\\\", func(c *gin.Context) {\\n c.JSON(http.StatusOK, gin.H{\\n \\\"message\\\": \\\"Hello, World!\\\",\\n })\\n })\\n\\n r.Run(\\\":8080\\\")\\n}\\n"
] |
e25a0bb05ff2b5621067fa0cd76a7cd12a9d3e9e |
# Dataset Card for Evaluation run of Sharathhebbar24/code_gpt2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Sharathhebbar24/code_gpt2](https://huggingface.co/Sharathhebbar24/code_gpt2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Sharathhebbar24__code_gpt2",
"harness_winogrande_5",
split="train")
```
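
The aggregated metrics mentioned above live in the additional "results" configuration; assuming it can be loaded the same way as the task configurations, a minimal sketch looks like this:

```python
# Hypothetical sketch: assumes the aggregated "results" configuration is loadable
# like any other configuration of this repository, with "train" pointing at the latest run.
from datasets import load_dataset

results = load_dataset("open-llm-leaderboard/details_Sharathhebbar24__code_gpt2",
                       "results",
                       split="train")
print(results[0])  # aggregated accuracy / stderr metrics for the latest run
```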
## Latest results
These are the [latest results from run 2024-02-04T16:58:30.724457](https://huggingface.co/datasets/open-llm-leaderboard/details_Sharathhebbar24__code_gpt2/blob/main/results_2024-02-04T16-58-30.724457.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2497624797144018,
"acc_stderr": 0.03053196950479339,
"acc_norm": 0.2509735053672201,
"acc_norm_stderr": 0.03134563472595933,
"mc1": 0.2484700122399021,
"mc1_stderr": 0.01512742709652068,
"mc2": 0.40599988322796493,
"mc2_stderr": 0.015122708963543154
},
"harness|arc:challenge|25": {
"acc": 0.189419795221843,
"acc_stderr": 0.01145070511591077,
"acc_norm": 0.23293515358361774,
"acc_norm_stderr": 0.012352507042617401
},
"harness|hellaswag|10": {
"acc": 0.2889862577175861,
"acc_stderr": 0.004523651184016271,
"acc_norm": 0.30989842660824535,
"acc_norm_stderr": 0.00461506381774186
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.20394736842105263,
"acc_stderr": 0.03279000406310053,
"acc_norm": 0.20394736842105263,
"acc_norm_stderr": 0.03279000406310053
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2037735849056604,
"acc_stderr": 0.0247907845017754,
"acc_norm": 0.2037735849056604,
"acc_norm_stderr": 0.0247907845017754
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.26011560693641617,
"acc_stderr": 0.033450369167889925,
"acc_norm": 0.26011560693641617,
"acc_norm_stderr": 0.033450369167889925
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364395,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364395
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.28085106382978725,
"acc_stderr": 0.02937917046412482,
"acc_norm": 0.28085106382978725,
"acc_norm_stderr": 0.02937917046412482
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748141,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748141
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.20689655172413793,
"acc_stderr": 0.03375672449560553,
"acc_norm": 0.20689655172413793,
"acc_norm_stderr": 0.03375672449560553
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.14285714285714285,
"acc_stderr": 0.03129843185743808,
"acc_norm": 0.14285714285714285,
"acc_norm_stderr": 0.03129843185743808
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.16,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.16,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25161290322580643,
"acc_stderr": 0.024685979286239956,
"acc_norm": 0.25161290322580643,
"acc_norm_stderr": 0.024685979286239956
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.28078817733990147,
"acc_stderr": 0.031618563353586086,
"acc_norm": 0.28078817733990147,
"acc_norm_stderr": 0.031618563353586086
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542126,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542126
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.22424242424242424,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.22424242424242424,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.03358618145732522,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.03358618145732522
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22797927461139897,
"acc_stderr": 0.030276909945178256,
"acc_norm": 0.22797927461139897,
"acc_norm_stderr": 0.030276909945178256
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2153846153846154,
"acc_stderr": 0.020843034557462878,
"acc_norm": 0.2153846153846154,
"acc_norm_stderr": 0.020843034557462878
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.026466117538959912,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.026466117538959912
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23178807947019867,
"acc_stderr": 0.03445406271987054,
"acc_norm": 0.23178807947019867,
"acc_norm_stderr": 0.03445406271987054
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3467889908256881,
"acc_stderr": 0.020406097104093027,
"acc_norm": 0.3467889908256881,
"acc_norm_stderr": 0.020406097104093027
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.02792096314799365,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.02792096314799365
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.03019028245350194,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.03019028245350194
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2616033755274262,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.2616033755274262,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.22869955156950672,
"acc_stderr": 0.02818824004692919,
"acc_norm": 0.22869955156950672,
"acc_norm_stderr": 0.02818824004692919
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.24427480916030533,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.24427480916030533,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.371900826446281,
"acc_stderr": 0.044120158066245044,
"acc_norm": 0.371900826446281,
"acc_norm_stderr": 0.044120158066245044
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3006134969325153,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.3006134969325153,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.24107142857142858,
"acc_stderr": 0.04059867246952688,
"acc_norm": 0.24107142857142858,
"acc_norm_stderr": 0.04059867246952688
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.26495726495726496,
"acc_stderr": 0.02891120880274946,
"acc_norm": 0.26495726495726496,
"acc_norm_stderr": 0.02891120880274946
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26181353767560667,
"acc_stderr": 0.01572083867844526,
"acc_norm": 0.26181353767560667,
"acc_norm_stderr": 0.01572083867844526
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24566473988439305,
"acc_stderr": 0.023176298203992005,
"acc_norm": 0.24566473988439305,
"acc_norm_stderr": 0.023176298203992005
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.024848018263875195,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.024848018263875195
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.27009646302250806,
"acc_stderr": 0.025218040373410616,
"acc_norm": 0.27009646302250806,
"acc_norm_stderr": 0.025218040373410616
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.23765432098765432,
"acc_stderr": 0.023683591837008553,
"acc_norm": 0.23765432098765432,
"acc_norm_stderr": 0.023683591837008553
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2730496453900709,
"acc_stderr": 0.026577860943307857,
"acc_norm": 0.2730496453900709,
"acc_norm_stderr": 0.026577860943307857
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24967405475880053,
"acc_stderr": 0.011054538377832318,
"acc_norm": 0.24967405475880053,
"acc_norm_stderr": 0.011054538377832318
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.35661764705882354,
"acc_stderr": 0.02909720956841195,
"acc_norm": 0.35661764705882354,
"acc_norm_stderr": 0.02909720956841195
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24836601307189543,
"acc_stderr": 0.017479487001364764,
"acc_norm": 0.24836601307189543,
"acc_norm_stderr": 0.017479487001364764
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2,
"acc_stderr": 0.03831305140884603,
"acc_norm": 0.2,
"acc_norm_stderr": 0.03831305140884603
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24897959183673468,
"acc_stderr": 0.02768297952296023,
"acc_norm": 0.24897959183673468,
"acc_norm_stderr": 0.02768297952296023
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24875621890547264,
"acc_stderr": 0.030567675938916707,
"acc_norm": 0.24875621890547264,
"acc_norm_stderr": 0.030567675938916707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.21084337349397592,
"acc_stderr": 0.031755547866299194,
"acc_norm": 0.21084337349397592,
"acc_norm_stderr": 0.031755547866299194
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.033773102522091945,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.033773102522091945
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2484700122399021,
"mc1_stderr": 0.01512742709652068,
"mc2": 0.40599988322796493,
"mc2_stderr": 0.015122708963543154
},
"harness|winogrande|5": {
"acc": 0.4925019731649566,
"acc_stderr": 0.014050905521228584
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Sharathhebbar24__code_gpt2 | [
"region:us"
] | 2024-02-04T16:59:50+00:00 | {"pretty_name": "Evaluation run of Sharathhebbar24/code_gpt2", "dataset_summary": "Dataset automatically created during the evaluation run of model [Sharathhebbar24/code_gpt2](https://huggingface.co/Sharathhebbar24/code_gpt2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Sharathhebbar24__code_gpt2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-04T16:58:30.724457](https://huggingface.co/datasets/open-llm-leaderboard/details_Sharathhebbar24__code_gpt2/blob/main/results_2024-02-04T16-58-30.724457.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2497624797144018,\n \"acc_stderr\": 0.03053196950479339,\n \"acc_norm\": 0.2509735053672201,\n \"acc_norm_stderr\": 0.03134563472595933,\n \"mc1\": 0.2484700122399021,\n \"mc1_stderr\": 0.01512742709652068,\n \"mc2\": 0.40599988322796493,\n \"mc2_stderr\": 0.015122708963543154\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.189419795221843,\n \"acc_stderr\": 0.01145070511591077,\n \"acc_norm\": 0.23293515358361774,\n \"acc_norm_stderr\": 0.012352507042617401\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2889862577175861,\n \"acc_stderr\": 0.004523651184016271,\n \"acc_norm\": 0.30989842660824535,\n \"acc_norm_stderr\": 0.00461506381774186\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.20394736842105263,\n \"acc_stderr\": 0.03279000406310053,\n \"acc_norm\": 0.20394736842105263,\n \"acc_norm_stderr\": 0.03279000406310053\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2037735849056604,\n \"acc_stderr\": 0.0247907845017754,\n \"acc_norm\": 0.2037735849056604,\n \"acc_norm_stderr\": 0.0247907845017754\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.18,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.26011560693641617,\n \"acc_stderr\": 0.033450369167889925,\n \"acc_norm\": 0.26011560693641617,\n \"acc_norm_stderr\": 0.033450369167889925\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364395,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364395\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.28085106382978725,\n \"acc_stderr\": 0.02937917046412482,\n \"acc_norm\": 0.28085106382978725,\n \"acc_norm_stderr\": 0.02937917046412482\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n \"acc_stderr\": 0.04049339297748141,\n \"acc_norm\": 0.24561403508771928,\n \"acc_norm_stderr\": 0.04049339297748141\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.20689655172413793,\n \"acc_stderr\": 0.03375672449560553,\n \"acc_norm\": 0.20689655172413793,\n \"acc_norm_stderr\": 0.03375672449560553\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.14285714285714285,\n \"acc_stderr\": 0.03129843185743808,\n \"acc_norm\": 0.14285714285714285,\n \"acc_norm_stderr\": 0.03129843185743808\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.16,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.16,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25161290322580643,\n \"acc_stderr\": 0.024685979286239956,\n \"acc_norm\": 0.25161290322580643,\n \"acc_norm_stderr\": 0.024685979286239956\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.28078817733990147,\n \"acc_stderr\": 0.031618563353586086,\n \"acc_norm\": 0.28078817733990147,\n \"acc_norm_stderr\": 0.031618563353586086\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542126,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542126\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.22424242424242424,\n \"acc_stderr\": 0.032568666616811015,\n \"acc_norm\": 0.22424242424242424,\n \"acc_norm_stderr\": 0.032568666616811015\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.03358618145732522,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.03358618145732522\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.22797927461139897,\n \"acc_stderr\": 0.030276909945178256,\n \"acc_norm\": 0.22797927461139897,\n \"acc_norm_stderr\": 0.030276909945178256\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.2153846153846154,\n \"acc_stderr\": 0.020843034557462878,\n \"acc_norm\": 0.2153846153846154,\n \"acc_norm_stderr\": 0.020843034557462878\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2518518518518518,\n \"acc_stderr\": 0.026466117538959912,\n \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.026466117538959912\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.23178807947019867,\n \"acc_stderr\": 0.03445406271987054,\n \"acc_norm\": 0.23178807947019867,\n \"acc_norm_stderr\": 0.03445406271987054\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3467889908256881,\n \"acc_stderr\": 0.020406097104093027,\n \"acc_norm\": 0.3467889908256881,\n \"acc_norm_stderr\": 0.020406097104093027\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.21296296296296297,\n \"acc_stderr\": 0.02792096314799365,\n \"acc_norm\": 0.21296296296296297,\n \"acc_norm_stderr\": 0.02792096314799365\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.03019028245350194,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.03019028245350194\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2616033755274262,\n \"acc_stderr\": 0.028609516716994934,\n \"acc_norm\": 0.2616033755274262,\n \"acc_norm_stderr\": 0.028609516716994934\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.22869955156950672,\n \"acc_stderr\": 0.02818824004692919,\n \"acc_norm\": 0.22869955156950672,\n \"acc_norm_stderr\": 0.02818824004692919\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.037683359597287434,\n \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.037683359597287434\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.371900826446281,\n \"acc_stderr\": 0.044120158066245044,\n \"acc_norm\": 0.371900826446281,\n \"acc_norm_stderr\": 0.044120158066245044\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.3006134969325153,\n \"acc_stderr\": 0.03602511318806771,\n \"acc_norm\": 0.3006134969325153,\n \"acc_norm_stderr\": 0.03602511318806771\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.24107142857142858,\n \"acc_stderr\": 0.04059867246952688,\n \"acc_norm\": 0.24107142857142858,\n \"acc_norm_stderr\": 0.04059867246952688\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.26495726495726496,\n \"acc_stderr\": 0.02891120880274946,\n \"acc_norm\": 0.26495726495726496,\n \"acc_norm_stderr\": 0.02891120880274946\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.26181353767560667,\n \"acc_stderr\": 0.01572083867844526,\n \"acc_norm\": 0.26181353767560667,\n \"acc_norm_stderr\": 0.01572083867844526\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24566473988439305,\n \"acc_stderr\": 0.023176298203992005,\n \"acc_norm\": 0.24566473988439305,\n \"acc_norm_stderr\": 0.023176298203992005\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.25163398692810457,\n \"acc_stderr\": 0.024848018263875195,\n \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.024848018263875195\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.27009646302250806,\n \"acc_stderr\": 0.025218040373410616,\n \"acc_norm\": 0.27009646302250806,\n \"acc_norm_stderr\": 0.025218040373410616\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.23765432098765432,\n \"acc_stderr\": 0.023683591837008553,\n \"acc_norm\": 0.23765432098765432,\n \"acc_norm_stderr\": 0.023683591837008553\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2730496453900709,\n \"acc_stderr\": 0.026577860943307857,\n \"acc_norm\": 0.2730496453900709,\n \"acc_norm_stderr\": 0.026577860943307857\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24967405475880053,\n \"acc_stderr\": 0.011054538377832318,\n \"acc_norm\": 0.24967405475880053,\n \"acc_norm_stderr\": 0.011054538377832318\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.35661764705882354,\n \"acc_stderr\": 0.02909720956841195,\n \"acc_norm\": 0.35661764705882354,\n \"acc_norm_stderr\": 0.02909720956841195\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.24836601307189543,\n \"acc_stderr\": 0.017479487001364764,\n \"acc_norm\": 0.24836601307189543,\n \"acc_norm_stderr\": 0.017479487001364764\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.03831305140884603,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.03831305140884603\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.24897959183673468,\n \"acc_stderr\": 0.02768297952296023,\n \"acc_norm\": 0.24897959183673468,\n \"acc_norm_stderr\": 0.02768297952296023\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24875621890547264,\n \"acc_stderr\": 0.030567675938916707,\n \"acc_norm\": 0.24875621890547264,\n \"acc_norm_stderr\": 0.030567675938916707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.21084337349397592,\n \"acc_stderr\": 0.031755547866299194,\n \"acc_norm\": 0.21084337349397592,\n \"acc_norm_stderr\": 0.031755547866299194\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.033773102522091945,\n \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.033773102522091945\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2484700122399021,\n \"mc1_stderr\": 0.01512742709652068,\n \"mc2\": 0.40599988322796493,\n \"mc2_stderr\": 0.015122708963543154\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.4925019731649566,\n \"acc_stderr\": 0.014050905521228584\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": 
"https://huggingface.co/Sharathhebbar24/code_gpt2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|arc:challenge|25_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|gsm8k|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|hellaswag|10_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T16-58-30.724457.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T16-58-30.724457.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T16-58-30.724457.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T16-58-30.724457.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T16-58-30.724457.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_04T16_58_30.724457", "path": ["**/details_harness|winogrande|5_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-04T16-58-30.724457.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_04T16_58_30.724457", "path": ["results_2024-02-04T16-58-30.724457.parquet"]}, {"split": "latest", "path": ["results_2024-02-04T16-58-30.724457.parquet"]}]}]} | 2024-02-04T17:00:13+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Sharathhebbar24/code_gpt2
Dataset automatically created during the evaluation run of model Sharathhebbar24/code_gpt2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
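The original snippet is not reproduced in this copy of the card; the minimal sketch below assumes the details repository follows the leaderboard's usual open-llm-leaderboard/details_<org>__<model> naming scheme, so the exact repo path is an assumption.
```python
from datasets import load_dataset

# Repo path assumed from the leaderboard's naming convention for detail datasets.
data = load_dataset(
    "open-llm-leaderboard/details_Sharathhebbar24__code_gpt2",
    "harness_winogrande_5",  # any of the 63 task configurations listed in the metadata
    split="train",           # "train" tracks the latest timestamped run
)
```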
## Latest results
These are the latest results from run 2024-02-04T16:58:30.724457 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Sharathhebbar24/code_gpt2\n\n\n\nDataset automatically created during the evaluation run of model Sharathhebbar24/code_gpt2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T16:58:30.724457(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Sharathhebbar24/code_gpt2\n\n\n\nDataset automatically created during the evaluation run of model Sharathhebbar24/code_gpt2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T16:58:30.724457(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
c71f3033896b383f72069a4fc365b3c716c4129f |
* [CSGO.zip](CSGO.zip) - 10,397 Samples RGB
* [CSGO_PGM.zip](CSGO_PGM.zip) - 17,440 Samples Grayscale
I collected this dataset of 17,440 samples myself, initially manually and later with automated assistance, but all images have been manually categorised.
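For quick inspection, the grayscale samples can be read with Pillow once the archive is extracted; a minimal sketch, assuming CSGO_PGM.zip unpacks into a folder of .pgm files (the internal directory layout and file extension are assumptions, not documented above):
```python
from pathlib import Path

import numpy as np
from PIL import Image

# Hypothetical layout: CSGO_PGM.zip extracted into ./CSGO_PGM/ with one .pgm per sample.
samples = [np.asarray(Image.open(p)) for p in sorted(Path("CSGO_PGM").rglob("*.pgm"))]
data = np.stack(samples)  # expected shape: (n_samples, 28, 28), uint8 grayscale
print(data.shape, data.dtype)
```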
[https://github.com/TFNN/DOCS/tree/main/DATASETS](https://github.com/TFNN/DOCS/tree/main/DATASETS) | tfnn/csgo-player-heads-28x28 | [
"license:mit",
"csgo",
"counter strike",
"cs2",
"counter-strike",
"region:us"
] | 2024-02-04T17:02:49+00:00 | {"license": "mit", "pretty_name": "CSGO Dataset of T&CT heads 28x28", "tags": ["csgo", "counter strike", "cs2", "counter-strike"]} | 2024-02-04T17:32:17+00:00 | [] | [] | TAGS
#license-mit #csgo #counter strike #cs2 #counter-strike #region-us
|
* URL - 10,397 Samples RGB
* CSGO_PGM.zip - 17,440 Samples Grayscale
I collected this dataset of 17,440 samples myself, initially manually and later with automated assistance, but all images have been manually categorised.
URL | [] | [
"TAGS\n#license-mit #csgo #counter strike #cs2 #counter-strike #region-us \n"
] |
347f622942a65f082b1f3bf8444c1e9f24040b52 |
* [QUAKE3.zip](QUAKE3.zip) - 7,444 Samples RGB
I collected this dataset of 7,444 samples myself, initially manually and later with automated assistance, but all images have been manually categorised.
[https://github.com/TFNN/DOCS/tree/main/DATASETS](https://github.com/TFNN/DOCS/tree/main/DATASETS)
| tfnn/quake3-aqua-bones-28x28 | [
"license:mit",
"quake3",
"ioquake3",
"quake3arena",
"teamarena",
"region:us"
] | 2024-02-04T17:05:23+00:00 | {"license": "mit", "pretty_name": "Quake 3 Arena Dataset of the Aqua Bones player model 28x28", "tags": ["quake3", "ioquake3", "quake3arena", "teamarena"]} | 2024-02-04T17:32:23+00:00 | [] | [] | TAGS
#license-mit #quake3 #ioquake3 #quake3arena #teamarena #region-us
|
* URL - 7,444 Samples RGB
I collected this dataset of 7,444 samples myself, initially manually and later with automated assistance, but all images have been manually categorised.
URL
| [] | [
"TAGS\n#license-mit #quake3 #ioquake3 #quake3arena #teamarena #region-us \n"
] |
c3c7cc23ea2051f69d14dfef2044f58af2d1704f | license: unknown
---
| MarkrAI/triviaqa_sample_autorag | [
"region:us"
] | 2024-02-04T17:21:56+00:00 | {"configs": [{"config_name": "qa", "splits": [{"name": "train", "data_files": "qa_train.parquet"}, {"name": "test", "data_files": "qa_test.parquet"}]}, {"config_name": "corpus", "data_files": "corpus.parquet"}]} | 2024-02-05T14:44:08+00:00 | [] | [] | TAGS
#region-us
| license: unknown
---
| [] | [
"TAGS\n#region-us \n"
] |
2949f0e38f46f822eb005b949a48cf56ae6551d3 |
# Dataset Card for Evaluation run of Josephgflowers/Tinyllama-1.3B-Cinder-Reason-Test
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Josephgflowers/Tinyllama-1.3B-Cinder-Reason-Test](https://huggingface.co/Josephgflowers/Tinyllama-1.3B-Cinder-Reason-Test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Josephgflowers__Tinyllama-1.3B-Cinder-Reason-Test",
"harness_winogrande_5",
split="train")
```
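The aggregated scores described above live in the "results" configuration; a minimal sketch of pulling them from the same repository follows (the exact column layout of the results table is not shown here, so the print is only illustrative):
```python
from datasets import load_dataset

# "latest" always resolves to the newest timestamped split of the run.
results = load_dataset(
    "open-llm-leaderboard/details_Josephgflowers__Tinyllama-1.3B-Cinder-Reason-Test",
    "results",
    split="latest",
)
print(results[0])  # aggregated metrics for the most recent evaluation run
```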
## Latest results
These are the [latest results from run 2024-02-04T17:23:23.398012](https://huggingface.co/datasets/open-llm-leaderboard/details_Josephgflowers__Tinyllama-1.3B-Cinder-Reason-Test/blob/main/results_2024-02-04T17-23-23.398012.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2709585369031208,
"acc_stderr": 0.03132168435923893,
"acc_norm": 0.2720079653463839,
"acc_norm_stderr": 0.032088283521883774,
"mc1": 0.2178702570379437,
"mc1_stderr": 0.014450846714123899,
"mc2": 0.3558941030383168,
"mc2_stderr": 0.013905917466770197
},
"harness|arc:challenge|25": {
"acc": 0.29436860068259385,
"acc_stderr": 0.013318528460539419,
"acc_norm": 0.3250853242320819,
"acc_norm_stderr": 0.013688147309729124
},
"harness|hellaswag|10": {
"acc": 0.4245170284803824,
"acc_stderr": 0.004932593348813624,
"acc_norm": 0.5584544911372237,
"acc_norm_stderr": 0.004955564650016174
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.03749850709174021,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.03749850709174021
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2236842105263158,
"acc_stderr": 0.03391160934343604,
"acc_norm": 0.2236842105263158,
"acc_norm_stderr": 0.03391160934343604
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.27547169811320754,
"acc_stderr": 0.027495663683724067,
"acc_norm": 0.27547169811320754,
"acc_norm_stderr": 0.027495663683724067
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2708333333333333,
"acc_stderr": 0.037161774375660164,
"acc_norm": 0.2708333333333333,
"acc_norm_stderr": 0.037161774375660164
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2023121387283237,
"acc_stderr": 0.030631145539198823,
"acc_norm": 0.2023121387283237,
"acc_norm_stderr": 0.030631145539198823
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.043364327079931785,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.043364327079931785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2765957446808511,
"acc_stderr": 0.02924188386962881,
"acc_norm": 0.2765957446808511,
"acc_norm_stderr": 0.02924188386962881
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.0409698513984367,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.0409698513984367
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2206896551724138,
"acc_stderr": 0.03455930201924811,
"acc_norm": 0.2206896551724138,
"acc_norm_stderr": 0.03455930201924811
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.26455026455026454,
"acc_stderr": 0.022717467897708617,
"acc_norm": 0.26455026455026454,
"acc_norm_stderr": 0.022717467897708617
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1984126984126984,
"acc_stderr": 0.03567016675276862,
"acc_norm": 0.1984126984126984,
"acc_norm_stderr": 0.03567016675276862
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.27741935483870966,
"acc_stderr": 0.025470196835900055,
"acc_norm": 0.27741935483870966,
"acc_norm_stderr": 0.025470196835900055
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.032104944337514575,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.032104944337514575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24848484848484848,
"acc_stderr": 0.03374402644139404,
"acc_norm": 0.24848484848484848,
"acc_norm_stderr": 0.03374402644139404
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.23232323232323232,
"acc_stderr": 0.030088629490217483,
"acc_norm": 0.23232323232323232,
"acc_norm_stderr": 0.030088629490217483
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.29533678756476683,
"acc_stderr": 0.03292296639155139,
"acc_norm": 0.29533678756476683,
"acc_norm_stderr": 0.03292296639155139
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.31025641025641026,
"acc_stderr": 0.023454674889404288,
"acc_norm": 0.31025641025641026,
"acc_norm_stderr": 0.023454674889404288
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.026067159222275798,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.026067159222275798
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.029597329730978086,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.029597329730978086
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389023,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389023
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23302752293577983,
"acc_stderr": 0.0181256691808615,
"acc_norm": 0.23302752293577983,
"acc_norm_stderr": 0.0181256691808615
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4675925925925926,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.4675925925925926,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.03132179803083289,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.03132179803083289
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2869198312236287,
"acc_stderr": 0.02944377302259469,
"acc_norm": 0.2869198312236287,
"acc_norm_stderr": 0.02944377302259469
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3811659192825112,
"acc_stderr": 0.03259625118416827,
"acc_norm": 0.3811659192825112,
"acc_norm_stderr": 0.03259625118416827
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.21374045801526717,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.21374045801526717,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.04236511258094635,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.04236511258094635
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.26380368098159507,
"acc_stderr": 0.03462419931615624,
"acc_norm": 0.26380368098159507,
"acc_norm_stderr": 0.03462419931615624
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.03894641120044792,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.03894641120044792
},
"harness|hendrycksTest-management|5": {
"acc": 0.2815533980582524,
"acc_stderr": 0.04453254836326469,
"acc_norm": 0.2815533980582524,
"acc_norm_stderr": 0.04453254836326469
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2094017094017094,
"acc_stderr": 0.026655699653922747,
"acc_norm": 0.2094017094017094,
"acc_norm_stderr": 0.026655699653922747
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2681992337164751,
"acc_stderr": 0.01584243083526944,
"acc_norm": 0.2681992337164751,
"acc_norm_stderr": 0.01584243083526944
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.22832369942196531,
"acc_stderr": 0.022598703804321628,
"acc_norm": 0.22832369942196531,
"acc_norm_stderr": 0.022598703804321628
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25251396648044694,
"acc_stderr": 0.014530330201468648,
"acc_norm": 0.25251396648044694,
"acc_norm_stderr": 0.014530330201468648
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.238562091503268,
"acc_stderr": 0.024404394928087866,
"acc_norm": 0.238562091503268,
"acc_norm_stderr": 0.024404394928087866
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2829581993569132,
"acc_stderr": 0.025583062489984838,
"acc_norm": 0.2829581993569132,
"acc_norm_stderr": 0.025583062489984838
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2623456790123457,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.2623456790123457,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2127659574468085,
"acc_stderr": 0.024414612974307706,
"acc_norm": 0.2127659574468085,
"acc_norm_stderr": 0.024414612974307706
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23142112125162972,
"acc_stderr": 0.01077146171157646,
"acc_norm": 0.23142112125162972,
"acc_norm_stderr": 0.01077146171157646
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4007352941176471,
"acc_stderr": 0.029768263528933105,
"acc_norm": 0.4007352941176471,
"acc_norm_stderr": 0.029768263528933105
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24183006535947713,
"acc_stderr": 0.017322789207784326,
"acc_norm": 0.24183006535947713,
"acc_norm_stderr": 0.017322789207784326
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3181818181818182,
"acc_stderr": 0.04461272175910508,
"acc_norm": 0.3181818181818182,
"acc_norm_stderr": 0.04461272175910508
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.02892058322067561,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.02892058322067561
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.22885572139303484,
"acc_stderr": 0.029705284056772426,
"acc_norm": 0.22885572139303484,
"acc_norm_stderr": 0.029705284056772426
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3132530120481928,
"acc_stderr": 0.03610805018031023,
"acc_norm": 0.3132530120481928,
"acc_norm_stderr": 0.03610805018031023
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2178702570379437,
"mc1_stderr": 0.014450846714123899,
"mc2": 0.3558941030383168,
"mc2_stderr": 0.013905917466770197
},
"harness|winogrande|5": {
"acc": 0.6211523283346487,
"acc_stderr": 0.013633724603180318
},
"harness|gsm8k|5": {
"acc": 0.02350265352539803,
"acc_stderr": 0.004172883669643982
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Josephgflowers__Tinyllama-1.3B-Cinder-Reason-Test | [
"region:us"
] | 2024-02-04T17:25:48+00:00 | {"pretty_name": "Evaluation run of Josephgflowers/Tinyllama-1.3B-Cinder-Reason-Test", "dataset_summary": "Dataset automatically created during the evaluation run of model [Josephgflowers/Tinyllama-1.3B-Cinder-Reason-Test](https://huggingface.co/Josephgflowers/Tinyllama-1.3B-Cinder-Reason-Test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Josephgflowers__Tinyllama-1.3B-Cinder-Reason-Test\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-04T17:23:23.398012](https://huggingface.co/datasets/open-llm-leaderboard/details_Josephgflowers__Tinyllama-1.3B-Cinder-Reason-Test/blob/main/results_2024-02-04T17-23-23.398012.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2709585369031208,\n \"acc_stderr\": 0.03132168435923893,\n \"acc_norm\": 0.2720079653463839,\n \"acc_norm_stderr\": 0.032088283521883774,\n \"mc1\": 0.2178702570379437,\n \"mc1_stderr\": 0.014450846714123899,\n \"mc2\": 0.3558941030383168,\n \"mc2_stderr\": 0.013905917466770197\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.29436860068259385,\n \"acc_stderr\": 0.013318528460539419,\n \"acc_norm\": 0.3250853242320819,\n \"acc_norm_stderr\": 0.013688147309729124\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4245170284803824,\n \"acc_stderr\": 0.004932593348813624,\n \"acc_norm\": 0.5584544911372237,\n \"acc_norm_stderr\": 0.004955564650016174\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2518518518518518,\n \"acc_stderr\": 0.03749850709174021,\n \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.03749850709174021\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.2236842105263158,\n \"acc_stderr\": 0.03391160934343604,\n \"acc_norm\": 0.2236842105263158,\n \"acc_norm_stderr\": 0.03391160934343604\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.27547169811320754,\n \"acc_stderr\": 0.027495663683724067,\n \"acc_norm\": 0.27547169811320754,\n \"acc_norm_stderr\": 0.027495663683724067\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2708333333333333,\n \"acc_stderr\": 0.037161774375660164,\n \"acc_norm\": 
0.2708333333333333,\n \"acc_norm_stderr\": 0.037161774375660164\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2023121387283237,\n \"acc_stderr\": 0.030631145539198823,\n \"acc_norm\": 0.2023121387283237,\n \"acc_norm_stderr\": 0.030631145539198823\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.043364327079931785,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.043364327079931785\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.2765957446808511,\n \"acc_stderr\": 0.02924188386962881,\n \"acc_norm\": 0.2765957446808511,\n \"acc_norm_stderr\": 0.02924188386962881\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n \"acc_stderr\": 0.0409698513984367,\n \"acc_norm\": 0.2543859649122807,\n \"acc_norm_stderr\": 0.0409698513984367\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2206896551724138,\n \"acc_stderr\": 0.03455930201924811,\n \"acc_norm\": 0.2206896551724138,\n \"acc_norm_stderr\": 0.03455930201924811\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.26455026455026454,\n \"acc_stderr\": 0.022717467897708617,\n \"acc_norm\": 0.26455026455026454,\n \"acc_norm_stderr\": 0.022717467897708617\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1984126984126984,\n \"acc_stderr\": 0.03567016675276862,\n \"acc_norm\": 0.1984126984126984,\n \"acc_norm_stderr\": 0.03567016675276862\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.27741935483870966,\n \"acc_stderr\": 0.025470196835900055,\n \"acc_norm\": 0.27741935483870966,\n \"acc_norm_stderr\": 0.025470196835900055\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n \"acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.24848484848484848,\n \"acc_stderr\": 0.03374402644139404,\n \"acc_norm\": 0.24848484848484848,\n \"acc_norm_stderr\": 0.03374402644139404\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.23232323232323232,\n \"acc_stderr\": 0.030088629490217483,\n \"acc_norm\": 0.23232323232323232,\n \"acc_norm_stderr\": 0.030088629490217483\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.29533678756476683,\n \"acc_stderr\": 
0.03292296639155139,\n \"acc_norm\": 0.29533678756476683,\n \"acc_norm_stderr\": 0.03292296639155139\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.31025641025641026,\n \"acc_stderr\": 0.023454674889404288,\n \"acc_norm\": 0.31025641025641026,\n \"acc_norm_stderr\": 0.023454674889404288\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24074074074074073,\n \"acc_stderr\": 0.026067159222275798,\n \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.026067159222275798\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.029597329730978086,\n \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.029597329730978086\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.23302752293577983,\n \"acc_stderr\": 0.0181256691808615,\n \"acc_norm\": 0.23302752293577983,\n \"acc_norm_stderr\": 0.0181256691808615\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4675925925925926,\n \"acc_stderr\": 0.03402801581358966,\n \"acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.03402801581358966\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.03132179803083289,\n \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.03132179803083289\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2869198312236287,\n \"acc_stderr\": 0.02944377302259469,\n \"acc_norm\": 0.2869198312236287,\n \"acc_norm_stderr\": 0.02944377302259469\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3811659192825112,\n \"acc_stderr\": 0.03259625118416827,\n \"acc_norm\": 0.3811659192825112,\n \"acc_norm_stderr\": 0.03259625118416827\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.21374045801526717,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.21374045801526717,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.04236511258094635,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.04236511258094635\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.26380368098159507,\n \"acc_stderr\": 0.03462419931615624,\n \"acc_norm\": 0.26380368098159507,\n \"acc_norm_stderr\": 0.03462419931615624\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.21428571428571427,\n \"acc_stderr\": 0.03894641120044792,\n \"acc_norm\": 0.21428571428571427,\n \"acc_norm_stderr\": 0.03894641120044792\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.2815533980582524,\n \"acc_stderr\": 0.04453254836326469,\n \"acc_norm\": 0.2815533980582524,\n \"acc_norm_stderr\": 0.04453254836326469\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2094017094017094,\n \"acc_stderr\": 0.026655699653922747,\n \"acc_norm\": 0.2094017094017094,\n \"acc_norm_stderr\": 0.026655699653922747\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \"acc_norm\": 
0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2681992337164751,\n \"acc_stderr\": 0.01584243083526944,\n \"acc_norm\": 0.2681992337164751,\n \"acc_norm_stderr\": 0.01584243083526944\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.22832369942196531,\n \"acc_stderr\": 0.022598703804321628,\n \"acc_norm\": 0.22832369942196531,\n \"acc_norm_stderr\": 0.022598703804321628\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25251396648044694,\n \"acc_stderr\": 0.014530330201468648,\n \"acc_norm\": 0.25251396648044694,\n \"acc_norm_stderr\": 0.014530330201468648\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.238562091503268,\n \"acc_stderr\": 0.024404394928087866,\n \"acc_norm\": 0.238562091503268,\n \"acc_norm_stderr\": 0.024404394928087866\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2829581993569132,\n \"acc_stderr\": 0.025583062489984838,\n \"acc_norm\": 0.2829581993569132,\n \"acc_norm_stderr\": 0.025583062489984838\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2623456790123457,\n \"acc_stderr\": 0.024477222856135114,\n \"acc_norm\": 0.2623456790123457,\n \"acc_norm_stderr\": 0.024477222856135114\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2127659574468085,\n \"acc_stderr\": 0.024414612974307706,\n \"acc_norm\": 0.2127659574468085,\n \"acc_norm_stderr\": 0.024414612974307706\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23142112125162972,\n \"acc_stderr\": 0.01077146171157646,\n \"acc_norm\": 0.23142112125162972,\n \"acc_norm_stderr\": 0.01077146171157646\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4007352941176471,\n \"acc_stderr\": 0.029768263528933105,\n \"acc_norm\": 0.4007352941176471,\n \"acc_norm_stderr\": 0.029768263528933105\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.24183006535947713,\n \"acc_stderr\": 0.017322789207784326,\n \"acc_norm\": 0.24183006535947713,\n \"acc_norm_stderr\": 0.017322789207784326\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3181818181818182,\n \"acc_stderr\": 0.04461272175910508,\n \"acc_norm\": 0.3181818181818182,\n \"acc_norm_stderr\": 0.04461272175910508\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.02892058322067561,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.02892058322067561\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.22885572139303484,\n \"acc_stderr\": 0.029705284056772426,\n \"acc_norm\": 0.22885572139303484,\n \"acc_norm_stderr\": 0.029705284056772426\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3132530120481928,\n \"acc_stderr\": 0.03610805018031023,\n \"acc_norm\": 0.3132530120481928,\n \"acc_norm_stderr\": 0.03610805018031023\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2178702570379437,\n \"mc1_stderr\": 0.014450846714123899,\n \"mc2\": 0.3558941030383168,\n \"mc2_stderr\": 0.013905917466770197\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6211523283346487,\n \"acc_stderr\": 
0.013633724603180318\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.02350265352539803,\n \"acc_stderr\": 0.004172883669643982\n }\n}\n```", "repo_url": "https://huggingface.co/Josephgflowers/Tinyllama-1.3B-Cinder-Reason-Test", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|arc:challenge|25_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|gsm8k|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hellaswag|10_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T17-23-23.398012.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T17-23-23.398012.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T17-23-23.398012.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T17-23-23.398012.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T17-23-23.398012.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["**/details_harness|winogrande|5_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-02-04T17-23-23.398012.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_04T17_23_23.398012", "path": ["results_2024-02-04T17-23-23.398012.parquet"]}, {"split": "latest", "path": ["results_2024-02-04T17-23-23.398012.parquet"]}]}]} | 2024-02-04T17:26:09+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Josephgflowers/Tinyllama-1.3B-Cinder-Reason-Test
Dataset automatically created during the evaluation run of model Josephgflowers/Tinyllama-1.3B-Cinder-Reason-Test on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
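(A minimal sketch — the dataset repository name below is assumed from the leaderboard's usual `details_<org>__<model>` naming convention and may need adjusting.)

```python
from datasets import load_dataset

# Repository name assumed from the leaderboard's details_<org>__<model> convention
data = load_dataset(
    "open-llm-leaderboard/details_Josephgflowers__Tinyllama-1.3B-Cinder-Reason-Test",
    "harness_winogrande_5",  # any of the 63 evaluation configurations can be passed here
    split="train",
)
```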
## Latest results
These are the latest results from run 2024-02-04T17:23:23.398012 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
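A brief excerpt of the aggregate scores from that run (Winogrande and GSM8K):

```python
{
    "harness|winogrande|5": {
        "acc": 0.6211523283346487,
        "acc_stderr": 0.013633724603180318
    },
    "harness|gsm8k|5": {
        "acc": 0.02350265352539803,
        "acc_stderr": 0.004172883669643982
    }
}
```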
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Josephgflowers/Tinyllama-1.3B-Cinder-Reason-Test\n\n\n\nDataset automatically created during the evaluation run of model Josephgflowers/Tinyllama-1.3B-Cinder-Reason-Test on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T17:23:23.398012(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Josephgflowers/Tinyllama-1.3B-Cinder-Reason-Test\n\n\n\nDataset automatically created during the evaluation run of model Josephgflowers/Tinyllama-1.3B-Cinder-Reason-Test on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T17:23:23.398012(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
7760ba3f556b8baea8a61e369d5d5df6b5972a33 |
Help me, it is not working 😭
What is the syntax for using the imatrix command? Does it have to be a .imatrix file? | Ji-Ha/deepseek-coder-7b-base-imatrix-wikitext | [
"license:apache-2.0",
"region:us"
] | 2024-02-04T17:30:16+00:00 | {"license": "apache-2.0"} | 2024-02-04T17:35:14+00:00 | [] | [] | TAGS
#license-apache-2.0 #region-us
|
Help me, it is not working
What is the syntax for using imatrix command?? It has to be .imatrix file? | [] | [
"TAGS\n#license-apache-2.0 #region-us \n"
] |
4dc62052cac9d3bfa3ff5db6e6abf112f5482e60 |
# Dataset Card for Evaluation run of sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO-corrupted
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO-corrupted](https://huggingface.co/sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO-corrupted) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_sonthenguyen__OpenHermes-2.5-Mistral-7B-mt-bench-DPO-corrupted",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-04T17:31:09.507616](https://huggingface.co/datasets/open-llm-leaderboard/details_sonthenguyen__OpenHermes-2.5-Mistral-7B-mt-bench-DPO-corrupted/blob/main/results_2024-02-04T17-31-09.507616.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.63871240933844,
"acc_stderr": 0.03227195231132849,
"acc_norm": 0.6412014980547024,
"acc_norm_stderr": 0.0329135684023751,
"mc1": 0.35862913096695226,
"mc1_stderr": 0.016789289499502022,
"mc2": 0.5283606118838293,
"mc2_stderr": 0.015281911992166169
},
"harness|arc:challenge|25": {
"acc": 0.6117747440273038,
"acc_stderr": 0.014241614207414051,
"acc_norm": 0.6527303754266212,
"acc_norm_stderr": 0.013913034529620453
},
"harness|hellaswag|10": {
"acc": 0.6560446126269668,
"acc_stderr": 0.004740555782142168,
"acc_norm": 0.8458474407488548,
"acc_norm_stderr": 0.0036035695286784127
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.02854479331905533,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.02854479331905533
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.02546714904546955,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.02546714904546955
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7935483870967742,
"acc_stderr": 0.02302589961718871,
"acc_norm": 0.7935483870967742,
"acc_norm_stderr": 0.02302589961718871
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526066,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526066
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.032250781083062896,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.032250781083062896
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768776,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768776
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6153846153846154,
"acc_stderr": 0.024666744915187208,
"acc_norm": 0.6153846153846154,
"acc_norm_stderr": 0.024666744915187208
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.02803792996911499,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.02803792996911499
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8330275229357799,
"acc_stderr": 0.01599015488507338,
"acc_norm": 0.8330275229357799,
"acc_norm_stderr": 0.01599015488507338
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.02812597226565437,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.02812597226565437
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.030636591348699803,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.030636591348699803
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.039578354719809805,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.039578354719809805
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.02250903393707781,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.02250903393707781
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8314176245210728,
"acc_stderr": 0.0133878957315436,
"acc_norm": 0.8314176245210728,
"acc_norm_stderr": 0.0133878957315436
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.02425790170532338,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.02425790170532338
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3128491620111732,
"acc_stderr": 0.015506892594647263,
"acc_norm": 0.3128491620111732,
"acc_norm_stderr": 0.015506892594647263
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7516339869281046,
"acc_stderr": 0.02473998135511359,
"acc_norm": 0.7516339869281046,
"acc_norm_stderr": 0.02473998135511359
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.684887459807074,
"acc_stderr": 0.026385273703464485,
"acc_norm": 0.684887459807074,
"acc_norm_stderr": 0.026385273703464485
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.02399350170904211,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.02399350170904211
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5035460992907801,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.5035460992907801,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4680573663624511,
"acc_stderr": 0.012744149704869647,
"acc_norm": 0.4680573663624511,
"acc_norm_stderr": 0.012744149704869647
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462937,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462937
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162662,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162662
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.027979823538744543,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.027979823538744543
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8109452736318408,
"acc_stderr": 0.027686913588013024,
"acc_norm": 0.8109452736318408,
"acc_norm_stderr": 0.027686913588013024
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.35862913096695226,
"mc1_stderr": 0.016789289499502022,
"mc2": 0.5283606118838293,
"mc2_stderr": 0.015281911992166169
},
"harness|winogrande|5": {
"acc": 0.7805840568271507,
"acc_stderr": 0.01163126836060778
},
"harness|gsm8k|5": {
"acc": 0.5807429871114481,
"acc_stderr": 0.013591720959042113
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_sonthenguyen__OpenHermes-2.5-Mistral-7B-mt-bench-DPO-corrupted | [
"region:us"
] | 2024-02-04T17:33:31+00:00 | {"pretty_name": "Evaluation run of sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO-corrupted", "dataset_summary": "Dataset automatically created during the evaluation run of model [sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO-corrupted](https://huggingface.co/sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO-corrupted) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_sonthenguyen__OpenHermes-2.5-Mistral-7B-mt-bench-DPO-corrupted\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-04T17:31:09.507616](https://huggingface.co/datasets/open-llm-leaderboard/details_sonthenguyen__OpenHermes-2.5-Mistral-7B-mt-bench-DPO-corrupted/blob/main/results_2024-02-04T17-31-09.507616.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.63871240933844,\n \"acc_stderr\": 0.03227195231132849,\n \"acc_norm\": 0.6412014980547024,\n \"acc_norm_stderr\": 0.0329135684023751,\n \"mc1\": 0.35862913096695226,\n \"mc1_stderr\": 0.016789289499502022,\n \"mc2\": 0.5283606118838293,\n \"mc2_stderr\": 0.015281911992166169\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6117747440273038,\n \"acc_stderr\": 0.014241614207414051,\n \"acc_norm\": 0.6527303754266212,\n \"acc_norm_stderr\": 0.013913034529620453\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6560446126269668,\n \"acc_stderr\": 0.004740555782142168,\n \"acc_norm\": 0.8458474407488548,\n \"acc_norm_stderr\": 0.0036035695286784127\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.02854479331905533,\n \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.02854479331905533\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n 
\"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546955\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7935483870967742,\n \"acc_stderr\": 0.02302589961718871,\n \"acc_norm\": 0.7935483870967742,\n \"acc_norm_stderr\": 0.02302589961718871\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526066,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.047258156262526066\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.032250781083062896,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.032250781083062896\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 
0.022473253332768776,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768776\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6153846153846154,\n \"acc_stderr\": 0.024666744915187208,\n \"acc_norm\": 0.6153846153846154,\n \"acc_norm_stderr\": 0.024666744915187208\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3037037037037037,\n \"acc_stderr\": 0.02803792996911499,\n \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.02803792996911499\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8330275229357799,\n \"acc_stderr\": 0.01599015488507338,\n \"acc_norm\": 0.8330275229357799,\n \"acc_norm_stderr\": 0.01599015488507338\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7990196078431373,\n \"acc_stderr\": 0.02812597226565437,\n \"acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.02812597226565437\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n \"acc_stderr\": 0.030636591348699803,\n \"acc_norm\": 0.7040358744394619,\n \"acc_norm_stderr\": 0.030636591348699803\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.039578354719809805,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.039578354719809805\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n \"acc_stderr\": 0.02250903393707781,\n \"acc_norm\": 0.8632478632478633,\n \"acc_norm_stderr\": 0.02250903393707781\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 
0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8314176245210728,\n \"acc_stderr\": 0.0133878957315436,\n \"acc_norm\": 0.8314176245210728,\n \"acc_norm_stderr\": 0.0133878957315436\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.02425790170532338,\n \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.02425790170532338\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3128491620111732,\n \"acc_stderr\": 0.015506892594647263,\n \"acc_norm\": 0.3128491620111732,\n \"acc_norm_stderr\": 0.015506892594647263\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7516339869281046,\n \"acc_stderr\": 0.02473998135511359,\n \"acc_norm\": 0.7516339869281046,\n \"acc_norm_stderr\": 0.02473998135511359\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n \"acc_stderr\": 0.026385273703464485,\n \"acc_norm\": 0.684887459807074,\n \"acc_norm_stderr\": 0.026385273703464485\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.02399350170904211,\n \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.02399350170904211\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4680573663624511,\n \"acc_stderr\": 0.012744149704869647,\n \"acc_norm\": 0.4680573663624511,\n \"acc_norm_stderr\": 0.012744149704869647\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462937,\n \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462937\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162662,\n \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162662\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.027979823538744543,\n \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.027979823538744543\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n \"acc_stderr\": 0.027686913588013024,\n \"acc_norm\": 0.8109452736318408,\n \"acc_norm_stderr\": 0.027686913588013024\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.35862913096695226,\n \"mc1_stderr\": 0.016789289499502022,\n \"mc2\": 0.5283606118838293,\n \"mc2_stderr\": 0.015281911992166169\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7805840568271507,\n \"acc_stderr\": 0.01163126836060778\n },\n 
\"harness|gsm8k|5\": {\n \"acc\": 0.5807429871114481,\n \"acc_stderr\": 0.013591720959042113\n }\n}\n```", "repo_url": "https://huggingface.co/sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO-corrupted", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|arc:challenge|25_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|gsm8k|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hellaswag|10_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T17-31-09.507616.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T17-31-09.507616.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T17-31-09.507616.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T17-31-09.507616.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T17-31-09.507616.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["**/details_harness|winogrande|5_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-02-04T17-31-09.507616.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_04T17_31_09.507616", "path": ["results_2024-02-04T17-31-09.507616.parquet"]}, {"split": "latest", "path": ["results_2024-02-04T17-31-09.507616.parquet"]}]}]} | 2024-02-04T17:33:58+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO-corrupted
Dataset automatically created during the evaluation run of model sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO-corrupted on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
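A minimal sketch (the dataset path below is an assumption based on the `details_<org>__<model>` naming pattern used by the other leaderboard cards in this document; the `harness_winogrande_5` config and `latest` split are the ones listed in this card's metadata):

```python
from datasets import load_dataset

# Load one evaluation config from this run. The repo id is assumed from the
# details_<org>__<model> naming pattern; "latest" always points to the most
# recent timestamped results for that config.
data = load_dataset(
    "open-llm-leaderboard/details_sonthenguyen__OpenHermes-2.5-Mistral-7B-mt-bench-DPO-corrupted",
    "harness_winogrande_5",
    split="latest",
)
```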
## Latest results
These are the latest results from run 2024-02-04T17:31:09.507616 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO-corrupted\n\n\n\nDataset automatically created during the evaluation run of model sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO-corrupted on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T17:31:09.507616(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO-corrupted\n\n\n\nDataset automatically created during the evaluation run of model sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO-corrupted on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T17:31:09.507616(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
b24ed48322dd725e3e8f0da79401a6a82356abf9 |
# Dataset Card for Evaluation run of kevin009/llamaRAGdrama
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [kevin009/llamaRAGdrama](https://huggingface.co/kevin009/llamaRAGdrama) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kevin009__llamaRAGdrama",
"harness_winogrande_5",
split="train")
```
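The same call pattern also retrieves the aggregated scores; a minimal sketch, assuming the `results` configuration and `latest` split listed in this card's metadata:

```python
from datasets import load_dataset

# Aggregated metrics for the most recent run of this model (the "results"
# config and "latest" split are taken from this card's config listing).
results = load_dataset("open-llm-leaderboard/details_kevin009__llamaRAGdrama",
                       "results",
                       split="latest")
print(results[0])  # one row per run, holding the aggregated scores
```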
## Latest results
These are the [latest results from run 2024-02-04T17:32:11.811282](https://huggingface.co/datasets/open-llm-leaderboard/details_kevin009__llamaRAGdrama/blob/main/results_2024-02-04T17-32-11.811282.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6509951836227733,
"acc_stderr": 0.03219813572713421,
"acc_norm": 0.6503747306678803,
"acc_norm_stderr": 0.03287917115292355,
"mc1": 0.5593635250917993,
"mc1_stderr": 0.017379697555437446,
"mc2": 0.7023726347025372,
"mc2_stderr": 0.015127172867320252
},
"harness|arc:challenge|25": {
"acc": 0.7013651877133106,
"acc_stderr": 0.013374078615068747,
"acc_norm": 0.7201365187713311,
"acc_norm_stderr": 0.01311904089772592
},
"harness|hellaswag|10": {
"acc": 0.7224656442939653,
"acc_stderr": 0.004468672138910926,
"acc_norm": 0.8882692690699064,
"acc_norm_stderr": 0.0031439103617792574
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544067,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544067
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.036146654241808254,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.036146654241808254
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4312169312169312,
"acc_stderr": 0.02550648169813821,
"acc_norm": 0.4312169312169312,
"acc_norm_stderr": 0.02550648169813821
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.031922715695483,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.031922715695483
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.0274796030105388,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.0274796030105388
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033456,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033456
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948475,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948475
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.03038835355188679,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.03038835355188679
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8348623853211009,
"acc_stderr": 0.01591955782997604,
"acc_norm": 0.8348623853211009,
"acc_norm_stderr": 0.01591955782997604
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926917,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926917
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.026558372502661916,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.026558372502661916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.03322015795776741,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.03322015795776741
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092368,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092368
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8160919540229885,
"acc_stderr": 0.013853724170922526,
"acc_norm": 0.8160919540229885,
"acc_norm_stderr": 0.013853724170922526
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.02394851290546836,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.02394851290546836
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4301675977653631,
"acc_stderr": 0.016558601636041035,
"acc_norm": 0.4301675977653631,
"acc_norm_stderr": 0.016558601636041035
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.0256468630971379,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.0256468630971379
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.02438366553103545,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.02438366553103545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4641460234680574,
"acc_stderr": 0.01273736131873058,
"acc_norm": 0.4641460234680574,
"acc_norm_stderr": 0.01273736131873058
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462923,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462923
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6584967320261438,
"acc_stderr": 0.019184639328092487,
"acc_norm": 0.6584967320261438,
"acc_norm_stderr": 0.019184639328092487
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142783,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142783
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5593635250917993,
"mc1_stderr": 0.017379697555437446,
"mc2": 0.7023726347025372,
"mc2_stderr": 0.015127172867320252
},
"harness|winogrande|5": {
"acc": 0.8666140489344909,
"acc_stderr": 0.00955544802642297
},
"harness|gsm8k|5": {
"acc": 0.6565579984836998,
"acc_stderr": 0.013079933811800308
}
}
```
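As a minimal illustrative sketch, the per-task scores in a results file like the one above can be aggregated with a few lines of Python. The local filename `results.json` below is only an assumption about where the JSON has been saved:
```python
import json

# Minimal sketch: assumes the results JSON above was saved locally as "results.json"
# (a hypothetical path). Averages the per-task accuracies of the hendrycksTest (MMLU) entries.
with open("results.json") as f:
    results = json.load(f)

mmlu_acc = {
    name: entry["acc"]
    for name, entry in results.items()
    if name.startswith("harness|hendrycksTest-")
}
print(f"{len(mmlu_acc)} MMLU tasks, mean acc = {sum(mmlu_acc.values()) / len(mmlu_acc):.4f}")
```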
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_kevin009__llamaRAGdrama | [
"region:us"
] | 2024-02-04T17:34:34+00:00 | {"pretty_name": "Evaluation run of kevin009/llamaRAGdrama", "dataset_summary": "Dataset automatically created during the evaluation run of model [kevin009/llamaRAGdrama](https://huggingface.co/kevin009/llamaRAGdrama) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kevin009__llamaRAGdrama\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-04T17:32:11.811282](https://huggingface.co/datasets/open-llm-leaderboard/details_kevin009__llamaRAGdrama/blob/main/results_2024-02-04T17-32-11.811282.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6509951836227733,\n \"acc_stderr\": 0.03219813572713421,\n \"acc_norm\": 0.6503747306678803,\n \"acc_norm_stderr\": 0.03287917115292355,\n \"mc1\": 0.5593635250917993,\n \"mc1_stderr\": 0.017379697555437446,\n \"mc2\": 0.7023726347025372,\n \"mc2_stderr\": 0.015127172867320252\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7013651877133106,\n \"acc_stderr\": 0.013374078615068747,\n \"acc_norm\": 0.7201365187713311,\n \"acc_norm_stderr\": 0.01311904089772592\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7224656442939653,\n \"acc_stderr\": 0.004468672138910926,\n \"acc_norm\": 0.8882692690699064,\n \"acc_norm_stderr\": 0.0031439103617792574\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544067,\n \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544067\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n 
\"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4312169312169312,\n \"acc_stderr\": 0.02550648169813821,\n \"acc_norm\": 0.4312169312169312,\n \"acc_norm_stderr\": 0.02550648169813821\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695483,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695483\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.0274796030105388,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.0274796030105388\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948475,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948475\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.03038835355188679,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.03038835355188679\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8348623853211009,\n \"acc_stderr\": 0.01591955782997604,\n \"acc_norm\": 0.8348623853211009,\n \"acc_norm_stderr\": 0.01591955782997604\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7890295358649789,\n \"acc_stderr\": 0.026558372502661916,\n \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.026558372502661916\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.03322015795776741,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.03322015795776741\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.020588491316092368,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.020588491316092368\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8160919540229885,\n \"acc_stderr\": 
0.013853724170922526,\n \"acc_norm\": 0.8160919540229885,\n \"acc_norm_stderr\": 0.013853724170922526\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.02394851290546836,\n \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.02394851290546836\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4301675977653631,\n \"acc_stderr\": 0.016558601636041035,\n \"acc_norm\": 0.4301675977653631,\n \"acc_norm_stderr\": 0.016558601636041035\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.0256468630971379,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.0256468630971379\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.02438366553103545,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.02438366553103545\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4641460234680574,\n \"acc_stderr\": 0.01273736131873058,\n \"acc_norm\": 0.4641460234680574,\n \"acc_norm_stderr\": 0.01273736131873058\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462923,\n \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462923\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6584967320261438,\n \"acc_stderr\": 0.019184639328092487,\n \"acc_norm\": 0.6584967320261438,\n \"acc_norm_stderr\": 0.019184639328092487\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142783,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142783\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5593635250917993,\n \"mc1_stderr\": 0.017379697555437446,\n \"mc2\": 0.7023726347025372,\n \"mc2_stderr\": 0.015127172867320252\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8666140489344909,\n \"acc_stderr\": 0.00955544802642297\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6565579984836998,\n \"acc_stderr\": 0.013079933811800308\n }\n}\n```", "repo_url": 
"https://huggingface.co/kevin009/llamaRAGdrama", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|arc:challenge|25_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|gsm8k|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|hellaswag|10_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T17-32-11.811282.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T17-32-11.811282.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T17-32-11.811282.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T17-32-11.811282.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T17-32-11.811282.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_04T17_32_11.811282", "path": ["**/details_harness|winogrande|5_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-04T17-32-11.811282.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_04T17_32_11.811282", "path": ["results_2024-02-04T17-32-11.811282.parquet"]}, {"split": "latest", "path": ["results_2024-02-04T17-32-11.811282.parquet"]}]}]} | 2024-02-04T17:34:57+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of kevin009/llamaRAGdrama
Dataset automatically created during the evaluation run of model kevin009/llamaRAGdrama on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
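These aggregated metrics can also be loaded directly. The sketch below is a minimal example that assumes the "results" configuration name and the "latest" split name listed in this card's configuration metadata:
```python
from datasets import load_dataset

# Run-level aggregated metrics live in the "results" configuration;
# the "latest" split always points to the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_kevin009__llamaRAGdrama",
	"results",
	split="latest")
```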
To load the details from a run, you can for instance do the following:
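```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kevin009__llamaRAGdrama",
	"harness_winogrande_5",
	split="train")
```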
## Latest results
These are the latest results from run 2024-02-04T17:32:11.811282 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of kevin009/llamaRAGdrama\n\n\n\nDataset automatically created during the evaluation run of model kevin009/llamaRAGdrama on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T17:32:11.811282(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of kevin009/llamaRAGdrama\n\n\n\nDataset automatically created during the evaluation run of model kevin009/llamaRAGdrama on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T17:32:11.811282(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
ea8b729ef4b642433b86e5f064e73fdb3786ebbc |
# Dataset Card for Evaluation run of FelixChao/Faraday-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [FelixChao/Faraday-7B](https://huggingface.co/FelixChao/Faraday-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_FelixChao__Faraday-7B",
"harness_winogrande_5",
split="train")
```
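
The same pattern works for any other configuration in this repository. The sketch below is illustrative only: the configuration name `harness_gsm8k_5` and the timestamped split `2024_02_05T08_14_19.602946` are taken from this card's metadata, and the `latest` split is assumed to mirror the most recent run.

```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_FelixChao__Faraday-7B"

# Per-sample details for the 5-shot GSM8K task, most recent run.
gsm8k_latest = load_dataset(repo, "harness_gsm8k_5", split="latest")

# The same task pinned to one specific run; split names follow the
# run timestamps listed in this card's metadata.
gsm8k_run = load_dataset(repo, "harness_gsm8k_5",
                         split="2024_02_05T08_14_19.602946")

print(gsm8k_latest)
```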
## Latest results
These are the [latest results from run 2024-02-05T08:14:19.602946](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__Faraday-7B/blob/main/results_2024-02-05T08-14-19.602946.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6527202304308277,
"acc_stderr": 0.032050575781462455,
"acc_norm": 0.6522706009646173,
"acc_norm_stderr": 0.032721311898657485,
"mc1": 0.5740514075887393,
"mc1_stderr": 0.01731047190407654,
"mc2": 0.730666625087402,
"mc2_stderr": 0.01464492425880792
},
"harness|arc:challenge|25": {
"acc": 0.7013651877133106,
"acc_stderr": 0.013374078615068744,
"acc_norm": 0.7226962457337884,
"acc_norm_stderr": 0.013082095839059376
},
"harness|hellaswag|10": {
"acc": 0.7165903206532563,
"acc_stderr": 0.004497325533959638,
"acc_norm": 0.8889663413662617,
"acc_norm_stderr": 0.0031353173122281247
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.04094376269996792,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.04094376269996792
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.02845015479411864,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.02845015479411864
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.0356760379963917,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.0356760379963917
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43386243386243384,
"acc_stderr": 0.02552503438247489,
"acc_norm": 0.43386243386243384,
"acc_norm_stderr": 0.02552503438247489
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677171,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677171
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.023415293433568525,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.023415293433568525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.02403548967633508,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.02403548967633508
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926924,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926924
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.026361651668389094,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.026361651668389094
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159463,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159463
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406974,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406974
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993466,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993466
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.023868003262500104,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.023868003262500104
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.41899441340782123,
"acc_stderr": 0.016501579306861677,
"acc_norm": 0.41899441340782123,
"acc_norm_stderr": 0.016501579306861677
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137897,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137897
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.02558306248998481,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.02558306248998481
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.023993501709042103,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.023993501709042103
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4745762711864407,
"acc_stderr": 0.012753716929101004,
"acc_norm": 0.4745762711864407,
"acc_norm_stderr": 0.012753716929101004
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.02841820861940676,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.02841820861940676
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.018975427920507208,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.018975427920507208
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.02650859065623327,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.02650859065623327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160896,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160896
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5740514075887393,
"mc1_stderr": 0.01731047190407654,
"mc2": 0.730666625087402,
"mc2_stderr": 0.01464492425880792
},
"harness|winogrande|5": {
"acc": 0.8531965272296764,
"acc_stderr": 0.009946627440250676
},
"harness|gsm8k|5": {
"acc": 0.6724791508718726,
"acc_stderr": 0.012927102210426722
}
}
```
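
For quick post-processing, the block above is just a JSON object. The sketch below is a minimal, non-authoritative example: it assumes a local copy of the linked results file, and the guess that per-task metrics may sit under a top-level `results` key is not verified here. It averages the accuracy of the MMLU (`hendrycksTest`) sub-tasks.

```python
import json

# Local copy of the linked results file (download it from the repo first).
with open("results_2024-02-05T08-14-19.602946.json") as f:
    raw = json.load(f)

# The hub file may nest the per-task metrics under a "results" key;
# fall back to the top level otherwise (assumption, not verified).
results = raw.get("results", raw)

# Macro-average accuracy over the MMLU (hendrycksTest) sub-tasks.
mmlu_accs = [v["acc"] for k, v in results.items()
             if k.startswith("harness|hendrycksTest-")]
print(f"{len(mmlu_accs)} MMLU sub-tasks, "
      f"mean acc = {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```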
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_FelixChao__Faraday-7B | [
"region:us"
] | 2024-02-04T17:36:00+00:00 | {"pretty_name": "Evaluation run of FelixChao/Faraday-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [FelixChao/Faraday-7B](https://huggingface.co/FelixChao/Faraday-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FelixChao__Faraday-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-05T08:14:19.602946](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__Faraday-7B/blob/main/results_2024-02-05T08-14-19.602946.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6527202304308277,\n \"acc_stderr\": 0.032050575781462455,\n \"acc_norm\": 0.6522706009646173,\n \"acc_norm_stderr\": 0.032721311898657485,\n \"mc1\": 0.5740514075887393,\n \"mc1_stderr\": 0.01731047190407654,\n \"mc2\": 0.730666625087402,\n \"mc2_stderr\": 0.01464492425880792\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7013651877133106,\n \"acc_stderr\": 0.013374078615068744,\n \"acc_norm\": 0.7226962457337884,\n \"acc_norm_stderr\": 0.013082095839059376\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7165903206532563,\n \"acc_stderr\": 0.004497325533959638,\n \"acc_norm\": 0.8889663413662617,\n \"acc_norm_stderr\": 0.0031353173122281247\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n \"acc_stderr\": 0.04094376269996792,\n \"acc_norm\": 0.6592592592592592,\n \"acc_norm_stderr\": 0.04094376269996792\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.02845015479411864,\n \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.02845015479411864\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 
0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.0356760379963917,\n \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.0356760379963917\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.43386243386243384,\n \"acc_stderr\": 0.02552503438247489,\n \"acc_norm\": 0.43386243386243384,\n \"acc_norm_stderr\": 0.02552503438247489\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n \"acc_stderr\": 0.023415293433568525,\n \"acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.023415293433568525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633508,\n \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633508\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926924,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926924\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.026361651668389094,\n \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.026361651668389094\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159463,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159463\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406974,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406974\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n \"acc_stderr\": 
0.013625556907993466,\n \"acc_norm\": 0.8237547892720306,\n \"acc_norm_stderr\": 0.013625556907993466\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500104,\n \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500104\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41899441340782123,\n \"acc_stderr\": 0.016501579306861677,\n \"acc_norm\": 0.41899441340782123,\n \"acc_norm_stderr\": 0.016501579306861677\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137897,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137897\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n \"acc_stderr\": 0.02558306248998481,\n \"acc_norm\": 0.7170418006430869,\n \"acc_norm_stderr\": 0.02558306248998481\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.023993501709042103,\n \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.023993501709042103\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4745762711864407,\n \"acc_stderr\": 0.012753716929101004,\n \"acc_norm\": 0.4745762711864407,\n \"acc_norm_stderr\": 0.012753716929101004\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.018975427920507208,\n \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.018975427920507208\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.02650859065623327,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.02650859065623327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160896,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160896\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5740514075887393,\n \"mc1_stderr\": 0.01731047190407654,\n \"mc2\": 0.730666625087402,\n \"mc2_stderr\": 0.01464492425880792\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8531965272296764,\n \"acc_stderr\": 0.009946627440250676\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6724791508718726,\n \"acc_stderr\": 0.012927102210426722\n }\n}\n```", "repo_url": 
"https://huggingface.co/FelixChao/Faraday-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|arc:challenge|25_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|arc:challenge|25_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|gsm8k|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|gsm8k|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hellaswag|10_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|hellaswag|10_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T17-33-41.720010.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T17-33-41.720010.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T17-33-41.720010.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T17-33-41.720010.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T17-33-41.720010.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T17-33-41.720010.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T17-33-41.720010.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T17-33-41.720010.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T17-33-41.720010.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T17-33-41.720010.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T17-33-41.720010.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T17-33-41.720010.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T17-33-41.720010.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T17-33-41.720010.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T17-33-41.720010.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T17-33-41.720010.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T17-33-41.720010.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T17-33-41.720010.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T17-33-41.720010.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T17-33-41.720010.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T17-33-41.720010.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T17-33-41.720010.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T17-33-41.720010.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T17-33-41.720010.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T17-33-41.720010.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T17-33-41.720010.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T17-33-41.720010.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T17-33-41.720010.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T17-33-41.720010.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T17-33-41.720010.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T17-33-41.720010.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T17-33-41.720010.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T17-33-41.720010.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T17-33-41.720010.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T17-33-41.720010.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T17-33-41.720010.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T17-33-41.720010.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T17-33-41.720010.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T17-33-41.720010.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T17-33-41.720010.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T17-33-41.720010.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T17-33-41.720010.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T17-33-41.720010.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T17-33-41.720010.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T17-33-41.720010.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T17-33-41.720010.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T17-33-41.720010.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T17-33-41.720010.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T17-33-41.720010.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T17-33-41.720010.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T17-33-41.720010.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T17-33-41.720010.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T17-33-41.720010.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T17-33-41.720010.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T17-33-41.720010.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T17-33-41.720010.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-05T08-14-19.602946.parquet", 
"**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T08-14-19.602946.parquet", 
"**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T08-14-19.602946.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-05T08-14-19.602946.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T17-33-41.720010.parquet"]}, 
{"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["**/details_harness|winogrande|5_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": ["**/details_harness|winogrande|5_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-05T08-14-19.602946.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_04T17_33_41.720010", "path": ["results_2024-02-04T17-33-41.720010.parquet"]}, {"split": "2024_02_05T08_14_19.602946", "path": 
["results_2024-02-05T08-14-19.602946.parquet"]}, {"split": "latest", "path": ["results_2024-02-05T08-14-19.602946.parquet"]}]}]} | 2024-02-05T08:17:02+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of FelixChao/Faraday-7B
Dataset automatically created during the evaluation run of model FelixChao/Faraday-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-02-05T08:14:19.602946 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of FelixChao/Faraday-7B\n\n\n\nDataset automatically created during the evaluation run of model FelixChao/Faraday-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-05T08:14:19.602946(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of FelixChao/Faraday-7B\n\n\n\nDataset automatically created during the evaluation run of model FelixChao/Faraday-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-05T08:14:19.602946(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
48cd89e2fe91e82288ee91778f7db4a6f7fe82a8 |
# Dataset Card for Evaluation run of gmonsoon/MiaAffogato-Indo-Mistral-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [gmonsoon/MiaAffogato-Indo-Mistral-7b](https://huggingface.co/gmonsoon/MiaAffogato-Indo-Mistral-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
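If you first want to see which configurations and timestamped splits are available, a minimal sketch (not part of the auto-generated card) using the standard `datasets` inspection helpers is:
```python
from datasets import get_dataset_config_names, get_dataset_split_names

repo_id = "open-llm-leaderboard/details_gmonsoon__MiaAffogato-Indo-Mistral-7b"

# Per-task configurations plus the aggregated "results" configuration
configs = get_dataset_config_names(repo_id)
print(len(configs), "configurations available")

# Each configuration exposes one split per run timestamp and a "latest" alias
print(get_dataset_split_names(repo_id, "harness_winogrande_5"))
```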
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_gmonsoon__MiaAffogato-Indo-Mistral-7b",
"harness_winogrande_5",
split="train")
```
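The aggregated scores described above live in the "results" configuration; a sketch for pulling only the most recent aggregate (using the "latest" split alias, under the assumption that it is exposed the same way as the per-task configurations) could be:
```python
from datasets import load_dataset

# One row per evaluation run, containing all aggregated metrics
results = load_dataset(
    "open-llm-leaderboard/details_gmonsoon__MiaAffogato-Indo-Mistral-7b",
    "results",
    split="latest",
)
print(results[0])
```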
## Latest results
These are the [latest results from run 2024-02-04T17:34:06.201391](https://huggingface.co/datasets/open-llm-leaderboard/details_gmonsoon__MiaAffogato-Indo-Mistral-7b/blob/main/results_2024-02-04T17-34-06.201391.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.644912685809928,
"acc_stderr": 0.03209671177294616,
"acc_norm": 0.6450526633312276,
"acc_norm_stderr": 0.032762897968506996,
"mc1": 0.4186046511627907,
"mc1_stderr": 0.017270015284476855,
"mc2": 0.5818075900070901,
"mc2_stderr": 0.015361574502931194
},
"harness|arc:challenge|25": {
"acc": 0.6322525597269625,
"acc_stderr": 0.014090995618168477,
"acc_norm": 0.6638225255972696,
"acc_norm_stderr": 0.013804855026205761
},
"harness|hellaswag|10": {
"acc": 0.6585341565425215,
"acc_stderr": 0.004732322172153749,
"acc_norm": 0.8543118900617407,
"acc_norm_stderr": 0.0035207225053320934
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.041539484047423976,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.041539484047423976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224469,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224469
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.025253032554997692,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.025253032554997692
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.02805779167298902,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.02805779167298902
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121437,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121437
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.028133252578815632,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.028133252578815632
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.030684737115135353,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.030684737115135353
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8330275229357799,
"acc_stderr": 0.01599015488507338,
"acc_norm": 0.8330275229357799,
"acc_norm_stderr": 0.01599015488507338
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.02584501798692692,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.02584501798692692
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.025530100460233494,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.025530100460233494
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752599,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752599
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.036959801280988226,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.036959801280988226
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5267857142857143,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.5267857142857143,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822585,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822585
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092382,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092382
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.013547415658662255,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.013547415658662255
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7427745664739884,
"acc_stderr": 0.023532925431044283,
"acc_norm": 0.7427745664739884,
"acc_norm_stderr": 0.023532925431044283
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.36983240223463687,
"acc_stderr": 0.016145881256056215,
"acc_norm": 0.36983240223463687,
"acc_norm_stderr": 0.016145881256056215
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.02573885479781874,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.02573885479781874
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188933,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188933
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4517601043024772,
"acc_stderr": 0.012710662233660247,
"acc_norm": 0.4517601043024772,
"acc_norm_stderr": 0.012710662233660247
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6654411764705882,
"acc_stderr": 0.028661996202335303,
"acc_norm": 0.6654411764705882,
"acc_norm_stderr": 0.028661996202335303
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.01933314202079716,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.01933314202079716
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.028920583220675596,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.028920583220675596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233257,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233257
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835816,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835816
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4186046511627907,
"mc1_stderr": 0.017270015284476855,
"mc2": 0.5818075900070901,
"mc2_stderr": 0.015361574502931194
},
"harness|winogrande|5": {
"acc": 0.8318863456985004,
"acc_stderr": 0.010510336954166737
},
"harness|gsm8k|5": {
"acc": 0.6770280515542078,
"acc_stderr": 0.012880360794851815
}
}
```
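Purely as an illustration of how these per-task entries can be post-processed (the snippet below is not part of the evaluation harness, and the three hard-coded scores are copied from the table above), one might rank the MMLU subtasks by normalised accuracy:
```python
# A minimal subset of the acc_norm scores shown above; paste the full dict for real use.
results = {
    "harness|hendrycksTest-marketing|5": {"acc_norm": 0.8888888888888888},
    "harness|hendrycksTest-college_mathematics|5": {"acc_norm": 0.29},
    "harness|arc:challenge|25": {"acc_norm": 0.6638225255972696},
}

# Keep only the MMLU (hendrycksTest) subtasks and sort them by acc_norm
mmlu = {
    task: scores["acc_norm"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest-")
}
for task, acc_norm in sorted(mmlu.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{task:55s} {acc_norm:.3f}")
```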
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_gmonsoon__MiaAffogato-Indo-Mistral-7b | [
"region:us"
] | 2024-02-04T17:36:25+00:00 | {"pretty_name": "Evaluation run of gmonsoon/MiaAffogato-Indo-Mistral-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [gmonsoon/MiaAffogato-Indo-Mistral-7b](https://huggingface.co/gmonsoon/MiaAffogato-Indo-Mistral-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_gmonsoon__MiaAffogato-Indo-Mistral-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-04T17:34:06.201391](https://huggingface.co/datasets/open-llm-leaderboard/details_gmonsoon__MiaAffogato-Indo-Mistral-7b/blob/main/results_2024-02-04T17-34-06.201391.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.644912685809928,\n \"acc_stderr\": 0.03209671177294616,\n \"acc_norm\": 0.6450526633312276,\n \"acc_norm_stderr\": 0.032762897968506996,\n \"mc1\": 0.4186046511627907,\n \"mc1_stderr\": 0.017270015284476855,\n \"mc2\": 0.5818075900070901,\n \"mc2_stderr\": 0.015361574502931194\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6322525597269625,\n \"acc_stderr\": 0.014090995618168477,\n \"acc_norm\": 0.6638225255972696,\n \"acc_norm_stderr\": 0.013804855026205761\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6585341565425215,\n \"acc_stderr\": 0.004732322172153749,\n \"acc_norm\": 0.8543118900617407,\n \"acc_norm_stderr\": 0.0035207225053320934\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.041539484047423976,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.041539484047423976\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.037455547914624555\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224469,\n \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224469\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4021164021164021,\n \"acc_stderr\": 0.025253032554997692,\n \"acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.025253032554997692\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7677419354838709,\n \"acc_stderr\": 0.024022256130308235,\n \"acc_norm\": 0.7677419354838709,\n \"acc_norm_stderr\": 0.024022256130308235\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8080808080808081,\n \"acc_stderr\": 0.02805779167298902,\n \"acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.02805779167298902\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121437,\n \"acc_norm\": 0.8808290155440415,\n 
\"acc_norm_stderr\": 0.023381935348121437\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3074074074074074,\n \"acc_stderr\": 0.028133252578815632,\n \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.028133252578815632\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.030684737115135353,\n \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.030684737115135353\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8330275229357799,\n \"acc_stderr\": 0.01599015488507338,\n \"acc_norm\": 0.8330275229357799,\n \"acc_norm_stderr\": 0.01599015488507338\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538272,\n \"acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538272\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.02584501798692692,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02584501798692692\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.025530100460233494,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.025530100460233494\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752599,\n \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752599\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.036959801280988226,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.036959801280988226\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5267857142857143,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.5267857142857143,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822585,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822585\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.020588491316092382,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.020588491316092382\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n \"acc_stderr\": 0.013547415658662255,\n \"acc_norm\": 0.8263090676883781,\n \"acc_norm_stderr\": 0.013547415658662255\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.023532925431044283,\n \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.023532925431044283\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.36983240223463687,\n \"acc_stderr\": 0.016145881256056215,\n \"acc_norm\": 0.36983240223463687,\n \"acc_norm_stderr\": 0.016145881256056215\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.02573885479781874,\n \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.02573885479781874\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.025670259242188933,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.025670259242188933\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4517601043024772,\n \"acc_stderr\": 0.012710662233660247,\n \"acc_norm\": 0.4517601043024772,\n \"acc_norm_stderr\": 0.012710662233660247\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6654411764705882,\n \"acc_stderr\": 0.028661996202335303,\n \"acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.028661996202335303\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.01933314202079716,\n \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.01933314202079716\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.028920583220675596,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.028920583220675596\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.026508590656233257,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.026508590656233257\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n \"acc_stderr\": 0.03891364495835816,\n \"acc_norm\": 0.5120481927710844,\n \"acc_norm_stderr\": 0.03891364495835816\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4186046511627907,\n \"mc1_stderr\": 0.017270015284476855,\n \"mc2\": 0.5818075900070901,\n \"mc2_stderr\": 0.015361574502931194\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8318863456985004,\n \"acc_stderr\": 0.010510336954166737\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.6770280515542078,\n \"acc_stderr\": 0.012880360794851815\n }\n}\n```", "repo_url": "https://huggingface.co/gmonsoon/MiaAffogato-Indo-Mistral-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|arc:challenge|25_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|gsm8k|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hellaswag|10_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T17-34-06.201391.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T17-34-06.201391.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T17-34-06.201391.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T17-34-06.201391.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T17-34-06.201391.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["**/details_harness|winogrande|5_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-02-04T17-34-06.201391.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_04T17_34_06.201391", "path": ["results_2024-02-04T17-34-06.201391.parquet"]}, {"split": "latest", "path": ["results_2024-02-04T17-34-06.201391.parquet"]}]}]} | 2024-02-04T17:36:47+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of gmonsoon/MiaAffogato-Indo-Mistral-7b
Dataset automatically created during the evaluation run of model gmonsoon/MiaAffogato-Indo-Mistral-7b on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
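For example (a minimal sketch; the repository and configuration names below are inferred from the parquet file listing in this card's metadata, following the same naming pattern as the other evaluation cards in this collection):

```python
from datasets import load_dataset

# Load the WinoGrande details for this run; config names follow the
# "harness_<task>_<n_shot>" pattern and "train" points at the latest results.
data = load_dataset("open-llm-leaderboard/details_gmonsoon__MiaAffogato-Indo-Mistral-7b",
	"harness_winogrande_5",
	split="train")
```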
## Latest results
These are the latest results from run 2024-02-04T17:34:06.201391 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and in the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of gmonsoon/MiaAffogato-Indo-Mistral-7b\n\n\n\nDataset automatically created during the evaluation run of model gmonsoon/MiaAffogato-Indo-Mistral-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T17:34:06.201391(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of gmonsoon/MiaAffogato-Indo-Mistral-7b\n\n\n\nDataset automatically created during the evaluation run of model gmonsoon/MiaAffogato-Indo-Mistral-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T17:34:06.201391(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
3570d92d76ff58fe4a471df80f8a4fe257f2bff7 |
# Dataset Card for Evaluation run of gmonsoon/OpenMia-Indo-Mistral-7b-v3-refined
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [gmonsoon/OpenMia-Indo-Mistral-7b-v3-refined](https://huggingface.co/gmonsoon/OpenMia-Indo-Mistral-7b-v3-refined) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_gmonsoon__OpenMia-Indo-Mistral-7b-v3-refined",
"harness_winogrande_5",
split="train")
```
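If you are unsure which configuration to request, the available names can be listed programmatically. This is a small sketch using the standard `datasets` helper; it only assumes the repository id shown above:

```python
from datasets import get_dataset_config_names

# One configuration per evaluated task, plus the aggregated "results" config;
# any of these names can be passed to load_dataset as in the example above.
configs = get_dataset_config_names("open-llm-leaderboard/details_gmonsoon__OpenMia-Indo-Mistral-7b-v3-refined")
print(configs)
```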
## Latest results
These are the [latest results from run 2024-02-04T19:41:28.521258](https://huggingface.co/datasets/open-llm-leaderboard/details_gmonsoon__OpenMia-Indo-Mistral-7b-v3-refined/blob/main/results_2024-02-04T19-41-28.521258.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6291464441848997,
"acc_stderr": 0.0325052901080867,
"acc_norm": 0.6303599490256852,
"acc_norm_stderr": 0.03317036353110983,
"mc1": 0.3769889840881273,
"mc1_stderr": 0.01696551757893035,
"mc2": 0.5394835979252358,
"mc2_stderr": 0.015320518165942699
},
"harness|arc:challenge|25": {
"acc": 0.6049488054607508,
"acc_stderr": 0.014285898292938167,
"acc_norm": 0.64419795221843,
"acc_norm_stderr": 0.01399057113791876
},
"harness|hellaswag|10": {
"acc": 0.640211113324039,
"acc_stderr": 0.004789575163418651,
"acc_norm": 0.842162915753834,
"acc_norm_stderr": 0.0036384306206139337
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5276595744680851,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.5276595744680851,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.02490699045899257,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.02490699045899257
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181012,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181012
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922986,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922986
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.025033870583015178,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.025033870583015178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6487179487179487,
"acc_stderr": 0.024203665177902803,
"acc_norm": 0.6487179487179487,
"acc_norm_stderr": 0.024203665177902803
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6512605042016807,
"acc_stderr": 0.03095663632856655,
"acc_norm": 0.6512605042016807,
"acc_norm_stderr": 0.03095663632856655
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8146788990825689,
"acc_stderr": 0.016659279700295827,
"acc_norm": 0.8146788990825689,
"acc_norm_stderr": 0.016659279700295827
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639325,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639325
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.0263616516683891,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.0263616516683891
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.030636591348699803,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.030636591348699803
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.816793893129771,
"acc_stderr": 0.03392770926494733,
"acc_norm": 0.816793893129771,
"acc_norm_stderr": 0.03392770926494733
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.040655781409087044,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.040655781409087044
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.0398913985953177,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.0398913985953177
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092375,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092375
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8160919540229885,
"acc_stderr": 0.013853724170922533,
"acc_norm": 0.8160919540229885,
"acc_norm_stderr": 0.013853724170922533
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.26256983240223464,
"acc_stderr": 0.014716824273017765,
"acc_norm": 0.26256983240223464,
"acc_norm_stderr": 0.014716824273017765
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7026143790849673,
"acc_stderr": 0.02617390850671858,
"acc_norm": 0.7026143790849673,
"acc_norm_stderr": 0.02617390850671858
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188933,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188933
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.025630824975621355,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.025630824975621355
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873862,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873862
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4654498044328553,
"acc_stderr": 0.012739711554045704,
"acc_norm": 0.4654498044328553,
"acc_norm_stderr": 0.012739711554045704
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6397058823529411,
"acc_stderr": 0.029163128570670733,
"acc_norm": 0.6397058823529411,
"acc_norm_stderr": 0.029163128570670733
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6339869281045751,
"acc_stderr": 0.019488025745529675,
"acc_norm": 0.6339869281045751,
"acc_norm_stderr": 0.019488025745529675
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.02879518557429129,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.02879518557429129
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233268,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233268
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.029913127232368043,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.029913127232368043
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3769889840881273,
"mc1_stderr": 0.01696551757893035,
"mc2": 0.5394835979252358,
"mc2_stderr": 0.015320518165942699
},
"harness|winogrande|5": {
"acc": 0.8153117600631413,
"acc_stderr": 0.01090597811215688
},
"harness|gsm8k|5": {
"acc": 0.6125852918877938,
"acc_stderr": 0.01341879844782737
}
}
```
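The same numbers are also stored in the aggregated "results" configuration. A minimal sketch for loading it (the "latest" split name is taken from this card's configuration listing; the exact parquet schema is not documented here, so the example only prints the loaded dataset):

```python
from datasets import load_dataset

# "latest" always points at the most recent run (2024-02-04T19:41:28 for this card).
results = load_dataset("open-llm-leaderboard/details_gmonsoon__OpenMia-Indo-Mistral-7b-v3-refined",
	"results",
	split="latest")
print(results)
```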
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_gmonsoon__OpenMia-Indo-Mistral-7b-v3-refined | [
"region:us"
] | 2024-02-04T17:50:19+00:00 | {"pretty_name": "Evaluation run of gmonsoon/OpenMia-Indo-Mistral-7b-v3-refined", "dataset_summary": "Dataset automatically created during the evaluation run of model [gmonsoon/OpenMia-Indo-Mistral-7b-v3-refined](https://huggingface.co/gmonsoon/OpenMia-Indo-Mistral-7b-v3-refined) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_gmonsoon__OpenMia-Indo-Mistral-7b-v3-refined\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-04T19:41:28.521258](https://huggingface.co/datasets/open-llm-leaderboard/details_gmonsoon__OpenMia-Indo-Mistral-7b-v3-refined/blob/main/results_2024-02-04T19-41-28.521258.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6291464441848997,\n \"acc_stderr\": 0.0325052901080867,\n \"acc_norm\": 0.6303599490256852,\n \"acc_norm_stderr\": 0.03317036353110983,\n \"mc1\": 0.3769889840881273,\n \"mc1_stderr\": 0.01696551757893035,\n \"mc2\": 0.5394835979252358,\n \"mc2_stderr\": 0.015320518165942699\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6049488054607508,\n \"acc_stderr\": 0.014285898292938167,\n \"acc_norm\": 0.64419795221843,\n \"acc_norm_stderr\": 0.01399057113791876\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.640211113324039,\n \"acc_stderr\": 0.004789575163418651,\n \"acc_norm\": 0.842162915753834,\n \"acc_norm_stderr\": 0.0036384306206139337\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 
0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5276595744680851,\n \"acc_stderr\": 0.03263597118409769,\n \"acc_norm\": 0.5276595744680851,\n \"acc_norm_stderr\": 0.03263597118409769\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.373015873015873,\n \"acc_stderr\": 0.02490699045899257,\n \"acc_norm\": 0.373015873015873,\n \"acc_norm_stderr\": 0.02490699045899257\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n \"acc_stderr\": 0.023785577884181012,\n \"acc_norm\": 0.7741935483870968,\n \"acc_norm_stderr\": 0.023785577884181012\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.047258156262526094\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.02860620428922986,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922986\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015178,\n \"acc_norm\": 
0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015178\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6487179487179487,\n \"acc_stderr\": 0.024203665177902803,\n \"acc_norm\": 0.6487179487179487,\n \"acc_norm_stderr\": 0.024203665177902803\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6512605042016807,\n \"acc_stderr\": 0.03095663632856655,\n \"acc_norm\": 0.6512605042016807,\n \"acc_norm_stderr\": 0.03095663632856655\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8146788990825689,\n \"acc_stderr\": 0.016659279700295827,\n \"acc_norm\": 0.8146788990825689,\n \"acc_norm_stderr\": 0.016659279700295827\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.803921568627451,\n \"acc_stderr\": 0.027865942286639325,\n \"acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639325\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n \"acc_stderr\": 0.030636591348699803,\n \"acc_norm\": 0.7040358744394619,\n \"acc_norm_stderr\": 0.030636591348699803\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.816793893129771,\n \"acc_stderr\": 0.03392770926494733,\n \"acc_norm\": 0.816793893129771,\n \"acc_norm_stderr\": 0.03392770926494733\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.040655781409087044,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.040655781409087044\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.0398913985953177,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.0398913985953177\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.020588491316092375,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.020588491316092375\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 
0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8160919540229885,\n \"acc_stderr\": 0.013853724170922533,\n \"acc_norm\": 0.8160919540229885,\n \"acc_norm_stderr\": 0.013853724170922533\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.02440517393578323,\n \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.02440517393578323\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.26256983240223464,\n \"acc_stderr\": 0.014716824273017765,\n \"acc_norm\": 0.26256983240223464,\n \"acc_norm_stderr\": 0.014716824273017765\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7026143790849673,\n \"acc_stderr\": 0.02617390850671858,\n \"acc_norm\": 0.7026143790849673,\n \"acc_norm_stderr\": 0.02617390850671858\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.025670259242188933,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.025670259242188933\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.025630824975621355,\n \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.025630824975621355\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873862,\n \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873862\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4654498044328553,\n \"acc_stderr\": 0.012739711554045704,\n \"acc_norm\": 0.4654498044328553,\n \"acc_norm_stderr\": 0.012739711554045704\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6397058823529411,\n \"acc_stderr\": 0.029163128570670733,\n \"acc_norm\": 0.6397058823529411,\n \"acc_norm_stderr\": 0.029163128570670733\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6339869281045751,\n \"acc_stderr\": 0.019488025745529675,\n \"acc_norm\": 0.6339869281045751,\n \"acc_norm_stderr\": 0.019488025745529675\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.02879518557429129,\n \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.02879518557429129\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.026508590656233268,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.026508590656233268\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.029913127232368043,\n \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.029913127232368043\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3769889840881273,\n \"mc1_stderr\": 0.01696551757893035,\n \"mc2\": 0.5394835979252358,\n \"mc2_stderr\": 0.015320518165942699\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8153117600631413,\n \"acc_stderr\": 0.01090597811215688\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.6125852918877938,\n \"acc_stderr\": 0.01341879844782737\n }\n}\n```", "repo_url": "https://huggingface.co/gmonsoon/OpenMia-Indo-Mistral-7b-v3-refined", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|arc:challenge|25_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|arc:challenge|25_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|gsm8k|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|gsm8k|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hellaswag|10_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|hellaswag|10_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T17-47-56.805589.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T17-47-56.805589.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T17-47-56.805589.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T17-47-56.805589.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T17-47-56.805589.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T17-47-56.805589.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T17-47-56.805589.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T17-47-56.805589.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T17-47-56.805589.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T17-47-56.805589.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T17-47-56.805589.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T17-47-56.805589.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T17-47-56.805589.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T17-47-56.805589.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T17-47-56.805589.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T17-47-56.805589.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T17-47-56.805589.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T17-47-56.805589.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T17-47-56.805589.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T17-47-56.805589.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T17-47-56.805589.parquet", 
"**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T17-47-56.805589.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T17-47-56.805589.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T17-47-56.805589.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T17-47-56.805589.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T17-47-56.805589.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T17-47-56.805589.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T17-47-56.805589.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T17-47-56.805589.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T17-47-56.805589.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T17-47-56.805589.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T17-47-56.805589.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T17-47-56.805589.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T17-47-56.805589.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T17-47-56.805589.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T17-47-56.805589.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T17-47-56.805589.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T17-47-56.805589.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T17-47-56.805589.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T17-47-56.805589.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T17-47-56.805589.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T17-47-56.805589.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T17-47-56.805589.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T17-47-56.805589.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T17-47-56.805589.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T17-47-56.805589.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T17-47-56.805589.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T17-47-56.805589.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T17-47-56.805589.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T17-47-56.805589.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T17-47-56.805589.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T17-47-56.805589.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T17-47-56.805589.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T17-47-56.805589.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T17-47-56.805589.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T17-47-56.805589.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T19-41-28.521258.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T19-41-28.521258.parquet", 
"**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T19-41-28.521258.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T19-41-28.521258.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T19-41-28.521258.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": 
"2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": 
"2024_02_04T19_41_28.521258", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T17-47-56.805589.parquet"]}, 
{"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["**/details_harness|winogrande|5_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": ["**/details_harness|winogrande|5_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-04T19-41-28.521258.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_04T17_47_56.805589", "path": ["results_2024-02-04T17-47-56.805589.parquet"]}, {"split": "2024_02_04T19_41_28.521258", "path": 
["results_2024-02-04T19-41-28.521258.parquet"]}, {"split": "latest", "path": ["results_2024-02-04T19-41-28.521258.parquet"]}]}]} | 2024-02-04T19:43:50+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of gmonsoon/OpenMia-Indo-Mistral-7b-v3-refined
Dataset automatically created during the evaluation run of model gmonsoon/OpenMia-Indo-Mistral-7b-v3-refined on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
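A minimal sketch of that call is below, mirroring the loader snippet recorded in this card's metadata (the repository id and the `harness_winogrande_5` config name are taken from that metadata, not invented here):

```python
from datasets import load_dataset

# Load the details of one evaluation config for this run;
# the "train" split always points to the latest results for that config.
data = load_dataset(
    "open-llm-leaderboard/details_gmonsoon__OpenMia-Indo-Mistral-7b-v3-refined",
    "harness_winogrande_5",
    split="train",
)
```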
## Latest results
These are the latest results from run 2024-02-04T19:41:28.521258 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
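As an abridged excerpt, the aggregated metrics reported for this run (values copied from the card metadata above; the full per-task breakdown is omitted here) are:

```python
{
    "all": {"acc": 0.6291464441848997, "acc_norm": 0.6303599490256852, "mc2": 0.5394835979252358},
    "harness|arc:challenge|25": {"acc_norm": 0.64419795221843},
    "harness|hellaswag|10": {"acc_norm": 0.842162915753834},
    "harness|winogrande|5": {"acc": 0.8153117600631413},
    "harness|gsm8k|5": {"acc": 0.6125852918877938}
}
```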
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of gmonsoon/OpenMia-Indo-Mistral-7b-v3-refined\n\n\n\nDataset automatically created during the evaluation run of model gmonsoon/OpenMia-Indo-Mistral-7b-v3-refined on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T19:41:28.521258(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of gmonsoon/OpenMia-Indo-Mistral-7b-v3-refined\n\n\n\nDataset automatically created during the evaluation run of model gmonsoon/OpenMia-Indo-Mistral-7b-v3-refined on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T19:41:28.521258(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
f35600197aa64b9e058386446e4b857bda61913e |
# Dataset Card for Evaluation run of sethuiyer/Eida_10.7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [sethuiyer/Eida_10.7B](https://huggingface.co/sethuiyer/Eida_10.7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_sethuiyer__Eida_10.7B",
"harness_winogrande_5",
split="train")
```
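
The per-task configurations can be complemented by the aggregated "results" configuration mentioned above. The snippet below is a minimal sketch of loading it; the `"latest"` split name is an assumption here, mirroring the convention used by the per-task configurations listed in this card's metadata.

```python
from datasets import load_dataset

# Minimal sketch (not part of the generated card): load the aggregated
# "results" configuration. The "latest" split name is assumed to follow
# the same convention as the per-task configurations.
results = load_dataset(
    "open-llm-leaderboard/details_sethuiyer__Eida_10.7B",
    "results",
    split="latest",
)
print(results)     # dataset object holding the aggregated run results
print(results[0])  # first row of aggregated metrics
```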
## Latest results
These are the [latest results from run 2024-02-04T17:53:38.566336](https://huggingface.co/datasets/open-llm-leaderboard/details_sethuiyer__Eida_10.7B/blob/main/results_2024-02-04T17-53-38.566336.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6448820645120363,
"acc_stderr": 0.03227930431319581,
"acc_norm": 0.648072900208979,
"acc_norm_stderr": 0.03293002063207752,
"mc1": 0.5446756425948592,
"mc1_stderr": 0.01743349010253876,
"mc2": 0.713310602231156,
"mc2_stderr": 0.014826867339224056
},
"harness|arc:challenge|25": {
"acc": 0.6988054607508533,
"acc_stderr": 0.013406741767847632,
"acc_norm": 0.7090443686006825,
"acc_norm_stderr": 0.0132730778659076
},
"harness|hellaswag|10": {
"acc": 0.6917944632543318,
"acc_stderr": 0.004608082815535489,
"acc_norm": 0.8736307508464449,
"acc_norm_stderr": 0.00331585991885755
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700918,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.03643037168958548,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.03643037168958548
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.548936170212766,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.548936170212766,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.04692008381368909,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.04692008381368909
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.02533120243894444,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.02533120243894444
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.02315787934908353,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.02315787934908353
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229872,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229872
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033477,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033477
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.024078696580635477,
"acc_norm": 0.6564102564102564,
"acc_norm_stderr": 0.024078696580635477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131143,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131143
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.03006676158297793,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.03006676158297793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242741,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242741
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8403669724770643,
"acc_stderr": 0.015703498348461763,
"acc_norm": 0.8403669724770643,
"acc_norm_stderr": 0.015703498348461763
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5601851851851852,
"acc_stderr": 0.0338517797604481,
"acc_norm": 0.5601851851851852,
"acc_norm_stderr": 0.0338517797604481
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8244274809160306,
"acc_stderr": 0.03336820338476075,
"acc_norm": 0.8244274809160306,
"acc_norm_stderr": 0.03336820338476075
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.034624199316156234,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.034624199316156234
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368982,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368982
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.41564245810055866,
"acc_stderr": 0.016482782187500666,
"acc_norm": 0.41564245810055866,
"acc_norm_stderr": 0.016482782187500666
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.02582916327275748,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.02582916327275748
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7191358024691358,
"acc_stderr": 0.02500646975579921,
"acc_norm": 0.7191358024691358,
"acc_norm_stderr": 0.02500646975579921
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4595827900912647,
"acc_stderr": 0.01272844606766997,
"acc_norm": 0.4595827900912647,
"acc_norm_stderr": 0.01272844606766997
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462923,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462923
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.019117213911495155,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.019117213911495155
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7061224489795919,
"acc_stderr": 0.02916273841024977,
"acc_norm": 0.7061224489795919,
"acc_norm_stderr": 0.02916273841024977
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233257,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233257
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-virology|5": {
"acc": 0.572289156626506,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.572289156626506,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.02954774168764004,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.02954774168764004
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5446756425948592,
"mc1_stderr": 0.01743349010253876,
"mc2": 0.713310602231156,
"mc2_stderr": 0.014826867339224056
},
"harness|winogrande|5": {
"acc": 0.8121546961325967,
"acc_stderr": 0.01097748110343509
},
"harness|gsm8k|5": {
"acc": 0.48142532221379836,
"acc_stderr": 0.013762977910317584
}
}
```
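
As an illustration of how these numbers can be post-processed, the sketch below averages the MMLU (hendrycksTest) sub-task accuracies from the dictionary shown above. The variable name `latest` is hypothetical; it stands for the parsed JSON above, e.g. obtained by downloading the linked results file.

```python
# Minimal sketch (hypothetical, not part of the generated card):
# `latest` is assumed to hold the dictionary printed above, e.g. parsed
# with json.load() from the linked results file.
mmlu = {
    task: scores["acc"]
    for task, scores in latest.items()
    if task.startswith("harness|hendrycksTest-")
}
mmlu_mean = sum(mmlu.values()) / len(mmlu)

print(f"MMLU sub-tasks     : {len(mmlu)}")          # 57 subjects in the standard MMLU suite
print(f"Mean MMLU acc      : {mmlu_mean:.4f}")
print(f"Reported 'all' acc : {latest['all']['acc']:.4f}")
```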
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_sethuiyer__Eida_10.7B | [
"region:us"
] | 2024-02-04T17:55:54+00:00 | {"pretty_name": "Evaluation run of sethuiyer/Eida_10.7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [sethuiyer/Eida_10.7B](https://huggingface.co/sethuiyer/Eida_10.7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_sethuiyer__Eida_10.7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-04T17:53:38.566336](https://huggingface.co/datasets/open-llm-leaderboard/details_sethuiyer__Eida_10.7B/blob/main/results_2024-02-04T17-53-38.566336.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6448820645120363,\n \"acc_stderr\": 0.03227930431319581,\n \"acc_norm\": 0.648072900208979,\n \"acc_norm_stderr\": 0.03293002063207752,\n \"mc1\": 0.5446756425948592,\n \"mc1_stderr\": 0.01743349010253876,\n \"mc2\": 0.713310602231156,\n \"mc2_stderr\": 0.014826867339224056\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6988054607508533,\n \"acc_stderr\": 0.013406741767847632,\n \"acc_norm\": 0.7090443686006825,\n \"acc_norm_stderr\": 0.0132730778659076\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6917944632543318,\n \"acc_stderr\": 0.004608082815535489,\n \"acc_norm\": 0.8736307508464449,\n \"acc_norm_stderr\": 0.00331585991885755\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700918,\n \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700918\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 
0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.03643037168958548,\n \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.03643037168958548\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.032529096196131965,\n \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.032529096196131965\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41005291005291006,\n \"acc_stderr\": 0.02533120243894444,\n \"acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.02533120243894444\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.02315787934908353,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.02315787934908353\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.028606204289229872,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229872\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033477,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033477\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.024078696580635477,\n \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.024078696580635477\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131143,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131143\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.03006676158297793,\n \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.03006676158297793\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242741,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242741\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8403669724770643,\n \"acc_stderr\": 0.015703498348461763,\n \"acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.015703498348461763\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5601851851851852,\n \"acc_stderr\": 0.0338517797604481,\n \"acc_norm\": 0.5601851851851852,\n \"acc_norm_stderr\": 0.0338517797604481\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8244274809160306,\n \"acc_stderr\": 0.03336820338476075,\n \"acc_norm\": 0.8244274809160306,\n \"acc_norm_stderr\": 0.03336820338476075\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.034624199316156234,\n \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.034624199316156234\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n 
\"acc_stderr\": 0.013702643715368982,\n \"acc_norm\": 0.8212005108556832,\n \"acc_norm_stderr\": 0.013702643715368982\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.02440517393578323,\n \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.02440517393578323\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41564245810055866,\n \"acc_stderr\": 0.016482782187500666,\n \"acc_norm\": 0.41564245810055866,\n \"acc_norm_stderr\": 0.016482782187500666\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.02582916327275748,\n \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.02582916327275748\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7191358024691358,\n \"acc_stderr\": 0.02500646975579921,\n \"acc_norm\": 0.7191358024691358,\n \"acc_norm_stderr\": 0.02500646975579921\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4595827900912647,\n \"acc_stderr\": 0.01272844606766997,\n \"acc_norm\": 0.4595827900912647,\n \"acc_norm_stderr\": 0.01272844606766997\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462923,\n \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462923\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6633986928104575,\n \"acc_stderr\": 0.019117213911495155,\n \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.019117213911495155\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.02916273841024977,\n \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.02916273841024977\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.026508590656233257,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.026508590656233257\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.572289156626506,\n \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.02954774168764004,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.02954774168764004\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5446756425948592,\n \"mc1_stderr\": 0.01743349010253876,\n \"mc2\": 0.713310602231156,\n \"mc2_stderr\": 0.014826867339224056\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8121546961325967,\n \"acc_stderr\": 0.01097748110343509\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.48142532221379836,\n \"acc_stderr\": 0.013762977910317584\n }\n}\n```", "repo_url": 
"https://huggingface.co/sethuiyer/Eida_10.7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|arc:challenge|25_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|gsm8k|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|hellaswag|10_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T17-53-38.566336.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T17-53-38.566336.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T17-53-38.566336.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T17-53-38.566336.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T17-53-38.566336.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_04T17_53_38.566336", "path": ["**/details_harness|winogrande|5_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-04T17-53-38.566336.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_04T17_53_38.566336", "path": ["results_2024-02-04T17-53-38.566336.parquet"]}, {"split": "latest", "path": ["results_2024-02-04T17-53-38.566336.parquet"]}]}]} | 2024-02-04T17:56:20+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of sethuiyer/Eida_10.7B
Dataset automatically created during the evaluation run of model sethuiyer/Eida_10.7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-02-04T17:53:38.566336 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of sethuiyer/Eida_10.7B\n\n\n\nDataset automatically created during the evaluation run of model sethuiyer/Eida_10.7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T17:53:38.566336(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of sethuiyer/Eida_10.7B\n\n\n\nDataset automatically created during the evaluation run of model sethuiyer/Eida_10.7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T17:53:38.566336(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
405076073d444c05cc5b2303fb70e641ea24324d |
# Dataset Card for Evaluation run of paulml/OmniBeagleMBX-v3-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [paulml/OmniBeagleMBX-v3-7B](https://huggingface.co/paulml/OmniBeagleMBX-v3-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_paulml__OmniBeagleMBX-v3-7B",
"harness_winogrande_5",
split="train")
```
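
The aggregated metrics referenced by the "results" configuration can be loaded the same way; below is a minimal sketch, assuming the "results" configuration and "latest" split listed in this repository's configs:

```python
from datasets import load_dataset

# Load the aggregated results (the config used to compute the leaderboard metrics);
# the "latest" split points to the most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_paulml__OmniBeagleMBX-v3-7B",
	"results",
	split="latest")
print(results[0])  # aggregated metrics for the latest run
```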
## Latest results
These are the [latest results from run 2024-02-04T18:02:06.576942](https://huggingface.co/datasets/open-llm-leaderboard/details_paulml__OmniBeagleMBX-v3-7B/blob/main/results_2024-02-04T18-02-06.576942.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6530956002999532,
"acc_stderr": 0.032066218331287186,
"acc_norm": 0.6522913486257421,
"acc_norm_stderr": 0.03274033242209032,
"mc1": 0.5960832313341493,
"mc1_stderr": 0.01717727682258428,
"mc2": 0.735154877958957,
"mc2_stderr": 0.014562986084455403
},
"harness|arc:challenge|25": {
"acc": 0.712457337883959,
"acc_stderr": 0.013226719056266127,
"acc_norm": 0.7380546075085325,
"acc_norm_stderr": 0.01284905482685811
},
"harness|hellaswag|10": {
"acc": 0.7229635530770763,
"acc_stderr": 0.004466200055292544,
"acc_norm": 0.8906592312288388,
"acc_norm_stderr": 0.0031142850772280387
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933714,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933714
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108102,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108102
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404907,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404907
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5,
"acc_stderr": 0.04472135954999579,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04472135954999579
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268545,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268545
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.02385479568097112,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.02385479568097112
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.02584501798692692,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.02584501798692692
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601443,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601443
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752598,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752598
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092368,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092368
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993464,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993464
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069363,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069363
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42569832402234636,
"acc_stderr": 0.01653682964899711,
"acc_norm": 0.42569832402234636,
"acc_norm_stderr": 0.01653682964899711
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.0242885336377261,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.0242885336377261
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47131681877444587,
"acc_stderr": 0.012749206007657476,
"acc_norm": 0.47131681877444587,
"acc_norm_stderr": 0.012749206007657476
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.02858270975389845,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.02858270975389845
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162673,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162673
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784596,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5960832313341493,
"mc1_stderr": 0.01717727682258428,
"mc2": 0.735154877958957,
"mc2_stderr": 0.014562986084455403
},
"harness|winogrande|5": {
"acc": 0.8539857932123125,
"acc_stderr": 0.009924440374585246
},
"harness|gsm8k|5": {
"acc": 0.6929492039423806,
"acc_stderr": 0.012705685723131707
}
}
```
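
As a quick post-processing example, the per-task numbers above can be aggregated directly; below is a minimal sketch, assuming the dictionary printed above has been saved locally as `results.json` (the file name is illustrative):

```python
import json
from statistics import mean

# Illustrative: average accuracy over the MMLU (hendrycksTest) subtasks
# from a locally saved copy of the results shown above.
with open("results.json") as f:
    results = json.load(f)

mmlu_accs = [v["acc"] for k, v in results.items()
             if k.startswith("harness|hendrycksTest-")]
print(f"MMLU subtasks: {len(mmlu_accs)}, mean acc: {mean(mmlu_accs):.4f}")
```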
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_paulml__OmniBeagleMBX-v3-7B | [
"region:us"
] | 2024-02-04T17:58:47+00:00 | {"pretty_name": "Evaluation run of paulml/OmniBeagleMBX-v3-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [paulml/OmniBeagleMBX-v3-7B](https://huggingface.co/paulml/OmniBeagleMBX-v3-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_paulml__OmniBeagleMBX-v3-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-04T18:02:06.576942](https://huggingface.co/datasets/open-llm-leaderboard/details_paulml__OmniBeagleMBX-v3-7B/blob/main/results_2024-02-04T18-02-06.576942.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6530956002999532,\n \"acc_stderr\": 0.032066218331287186,\n \"acc_norm\": 0.6522913486257421,\n \"acc_norm_stderr\": 0.03274033242209032,\n \"mc1\": 0.5960832313341493,\n \"mc1_stderr\": 0.01717727682258428,\n \"mc2\": 0.735154877958957,\n \"mc2_stderr\": 0.014562986084455403\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.712457337883959,\n \"acc_stderr\": 0.013226719056266127,\n \"acc_norm\": 0.7380546075085325,\n \"acc_norm_stderr\": 0.01284905482685811\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7229635530770763,\n \"acc_stderr\": 0.004466200055292544,\n \"acc_norm\": 0.8906592312288388,\n \"acc_norm_stderr\": 0.0031142850772280387\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933714,\n \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933714\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108102,\n \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108102\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404907,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404907\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04472135954999579,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04472135954999579\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.023287665127268545,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.023287665127268545\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.02385479568097112,\n \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.02385479568097112\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.02584501798692692,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02584501798692692\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601443,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601443\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.020588491316092368,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.020588491316092368\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n 
\"acc_stderr\": 0.013625556907993464,\n \"acc_norm\": 0.8237547892720306,\n \"acc_norm_stderr\": 0.013625556907993464\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069363,\n \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069363\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42569832402234636,\n \"acc_stderr\": 0.01653682964899711,\n \"acc_norm\": 0.42569832402234636,\n \"acc_norm_stderr\": 0.01653682964899711\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.0242885336377261,\n \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.0242885336377261\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47131681877444587,\n \"acc_stderr\": 0.012749206007657476,\n \"acc_norm\": 0.47131681877444587,\n \"acc_norm_stderr\": 0.012749206007657476\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.02858270975389845,\n \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.02858270975389845\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784596,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784596\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5960832313341493,\n \"mc1_stderr\": 0.01717727682258428,\n \"mc2\": 0.735154877958957,\n \"mc2_stderr\": 0.014562986084455403\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8539857932123125,\n \"acc_stderr\": 0.009924440374585246\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6929492039423806,\n \"acc_stderr\": 0.012705685723131707\n }\n}\n```", "repo_url": 
"https://huggingface.co/paulml/OmniBeagleMBX-v3-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|arc:challenge|25_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|arc:challenge|25_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|gsm8k|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|gsm8k|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hellaswag|10_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|hellaswag|10_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T17-56-19.578202.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T17-56-19.578202.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T17-56-19.578202.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T17-56-19.578202.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T17-56-19.578202.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T17-56-19.578202.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T17-56-19.578202.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T17-56-19.578202.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T17-56-19.578202.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T17-56-19.578202.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T17-56-19.578202.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T17-56-19.578202.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T17-56-19.578202.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T17-56-19.578202.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T17-56-19.578202.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T17-56-19.578202.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T17-56-19.578202.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T17-56-19.578202.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T17-56-19.578202.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T17-56-19.578202.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T17-56-19.578202.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T17-56-19.578202.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T17-56-19.578202.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T17-56-19.578202.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T17-56-19.578202.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T17-56-19.578202.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T17-56-19.578202.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T17-56-19.578202.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T17-56-19.578202.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T17-56-19.578202.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T17-56-19.578202.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T17-56-19.578202.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T17-56-19.578202.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T17-56-19.578202.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T17-56-19.578202.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T17-56-19.578202.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T17-56-19.578202.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T17-56-19.578202.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T17-56-19.578202.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T17-56-19.578202.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T17-56-19.578202.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T17-56-19.578202.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T17-56-19.578202.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T17-56-19.578202.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T17-56-19.578202.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T17-56-19.578202.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T17-56-19.578202.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T17-56-19.578202.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T17-56-19.578202.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T17-56-19.578202.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T17-56-19.578202.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T17-56-19.578202.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T17-56-19.578202.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T17-56-19.578202.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T17-56-19.578202.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T17-56-19.578202.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T18-02-06.576942.parquet", 
"**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T18-02-06.576942.parquet", 
"**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T18-02-06.576942.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T18-02-06.576942.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T17-56-19.578202.parquet"]}, 
{"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["**/details_harness|winogrande|5_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": ["**/details_harness|winogrande|5_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-04T18-02-06.576942.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_04T17_56_19.578202", "path": ["results_2024-02-04T17-56-19.578202.parquet"]}, {"split": "2024_02_04T18_02_06.576942", "path": 
["results_2024-02-04T18-02-06.576942.parquet"]}, {"split": "latest", "path": ["results_2024-02-04T18-02-06.576942.parquet"]}]}]} | 2024-02-04T18:04:26+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of paulml/OmniBeagleMBX-v3-7B
Dataset automatically created during the evaluation run of model paulml/OmniBeagleMBX-v3-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
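A minimal sketch of the intended call, assuming the standard `open-llm-leaderboard/details_<org>__<model>` repository naming and the `harness_winogrande_5` task configuration listed in this card's metadata:

```python
from datasets import load_dataset

# Load the per-sample details for one evaluated task; the second argument
# selects one of the 63 task configurations, and the "train" split points
# to the latest run, as described above.
data = load_dataset(
    "open-llm-leaderboard/details_paulml__OmniBeagleMBX-v3-7B",
    "harness_winogrande_5",
    split="train",
)
```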
## Latest results
These are the latest results from run 2024-02-04T18:02:06.576942 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of paulml/OmniBeagleMBX-v3-7B\n\n\n\nDataset automatically created during the evaluation run of model paulml/OmniBeagleMBX-v3-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T18:02:06.576942(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of paulml/OmniBeagleMBX-v3-7B\n\n\n\nDataset automatically created during the evaluation run of model paulml/OmniBeagleMBX-v3-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T18:02:06.576942(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
56f6f5c1abdde8ccadf5217255a79c219a59dfa2 |
# Dataset Card for Evaluation run of Josephgflowers/Tinyllama-1.3B-Cinder-Reason-Test-2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Josephgflowers/Tinyllama-1.3B-Cinder-Reason-Test-2](https://huggingface.co/Josephgflowers/Tinyllama-1.3B-Cinder-Reason-Test-2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Josephgflowers__Tinyllama-1.3B-Cinder-Reason-Test-2",
"harness_winogrande_5",
split="train")
```
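
The aggregated scores mentioned above are exposed through the "results" configuration; a minimal sketch of reading them, assuming the "latest" split naming used in this card's metadata:

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of each run;
# the "latest" split points at the most recent evaluation timestamp.
results = load_dataset(
    "open-llm-leaderboard/details_Josephgflowers__Tinyllama-1.3B-Cinder-Reason-Test-2",
    "results",
    split="latest",
)
print(results[0])  # aggregated scores of the latest run
```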
## Latest results
These are the [latest results from run 2024-02-04T20:26:53.463273](https://huggingface.co/datasets/open-llm-leaderboard/details_Josephgflowers__Tinyllama-1.3B-Cinder-Reason-Test-2/blob/main/results_2024-02-04T20-26-53.463273.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2605954222902013,
"acc_stderr": 0.030887287206153434,
"acc_norm": 0.2609822344299048,
"acc_norm_stderr": 0.031636108991043924,
"mc1": 0.22643818849449204,
"mc1_stderr": 0.014651337324602574,
"mc2": 0.372644846918848,
"mc2_stderr": 0.014009270688888235
},
"harness|arc:challenge|25": {
"acc": 0.29436860068259385,
"acc_stderr": 0.013318528460539422,
"acc_norm": 0.32764505119453924,
"acc_norm_stderr": 0.013715847940719346
},
"harness|hellaswag|10": {
"acc": 0.4347739494124676,
"acc_stderr": 0.004947141797384123,
"acc_norm": 0.5791674965146385,
"acc_norm_stderr": 0.004926837572202166
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.03712537833614866,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.03712537833614866
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.03317672787533157,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.03317672787533157
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2679245283018868,
"acc_stderr": 0.027257260322494845,
"acc_norm": 0.2679245283018868,
"acc_norm_stderr": 0.027257260322494845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2152777777777778,
"acc_stderr": 0.03437079344106134,
"acc_norm": 0.2152777777777778,
"acc_norm_stderr": 0.03437079344106134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.03242414757483099,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.03242414757483099
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179961,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179961
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3191489361702128,
"acc_stderr": 0.030472973363380045,
"acc_norm": 0.3191489361702128,
"acc_norm_stderr": 0.030472973363380045
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.04096985139843672,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.04096985139843672
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2689655172413793,
"acc_stderr": 0.03695183311650232,
"acc_norm": 0.2689655172413793,
"acc_norm_stderr": 0.03695183311650232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.20634920634920634,
"acc_stderr": 0.036196045241242515,
"acc_norm": 0.20634920634920634,
"acc_norm_stderr": 0.036196045241242515
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653695,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653695
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24838709677419354,
"acc_stderr": 0.024580028921481003,
"acc_norm": 0.24838709677419354,
"acc_norm_stderr": 0.024580028921481003
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2660098522167488,
"acc_stderr": 0.03108982600293753,
"acc_norm": 0.2660098522167488,
"acc_norm_stderr": 0.03108982600293753
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.21717171717171718,
"acc_stderr": 0.029376616484945637,
"acc_norm": 0.21717171717171718,
"acc_norm_stderr": 0.029376616484945637
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.24870466321243523,
"acc_stderr": 0.03119584087770031,
"acc_norm": 0.24870466321243523,
"acc_norm_stderr": 0.03119584087770031
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2846153846153846,
"acc_stderr": 0.022878322799706287,
"acc_norm": 0.2846153846153846,
"acc_norm_stderr": 0.022878322799706287
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.026067159222275805,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.026067159222275805
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.24789915966386555,
"acc_stderr": 0.028047967224176896,
"acc_norm": 0.24789915966386555,
"acc_norm_stderr": 0.028047967224176896
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2052980132450331,
"acc_stderr": 0.03297986648473834,
"acc_norm": 0.2052980132450331,
"acc_norm_stderr": 0.03297986648473834
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23302752293577983,
"acc_stderr": 0.0181256691808615,
"acc_norm": 0.23302752293577983,
"acc_norm_stderr": 0.0181256691808615
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3425925925925926,
"acc_stderr": 0.03236585252602158,
"acc_norm": 0.3425925925925926,
"acc_norm_stderr": 0.03236585252602158
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2911392405063291,
"acc_stderr": 0.029571601065753374,
"acc_norm": 0.2911392405063291,
"acc_norm_stderr": 0.029571601065753374
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.37668161434977576,
"acc_stderr": 0.032521134899291884,
"acc_norm": 0.37668161434977576,
"acc_norm_stderr": 0.032521134899291884
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.21374045801526717,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.21374045801526717,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.23140495867768596,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.23140495867768596,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.04077494709252628,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.04077494709252628
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3312883435582822,
"acc_stderr": 0.03697983910025588,
"acc_norm": 0.3312883435582822,
"acc_norm_stderr": 0.03697983910025588
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.22321428571428573,
"acc_stderr": 0.039523019677025116,
"acc_norm": 0.22321428571428573,
"acc_norm_stderr": 0.039523019677025116
},
"harness|hendrycksTest-management|5": {
"acc": 0.2524271844660194,
"acc_stderr": 0.04301250399690877,
"acc_norm": 0.2524271844660194,
"acc_norm_stderr": 0.04301250399690877
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.24358974358974358,
"acc_stderr": 0.02812096650391441,
"acc_norm": 0.24358974358974358,
"acc_norm_stderr": 0.02812096650391441
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2835249042145594,
"acc_stderr": 0.016117318166832283,
"acc_norm": 0.2835249042145594,
"acc_norm_stderr": 0.016117318166832283
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24566473988439305,
"acc_stderr": 0.02317629820399201,
"acc_norm": 0.24566473988439305,
"acc_norm_stderr": 0.02317629820399201
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24183006535947713,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.24183006535947713,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2861736334405145,
"acc_stderr": 0.025670259242188947,
"acc_norm": 0.2861736334405145,
"acc_norm_stderr": 0.025670259242188947
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2654320987654321,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.2654320987654321,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23049645390070922,
"acc_stderr": 0.025123739226872395,
"acc_norm": 0.23049645390070922,
"acc_norm_stderr": 0.025123739226872395
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2438070404172099,
"acc_stderr": 0.010966507972178475,
"acc_norm": 0.2438070404172099,
"acc_norm_stderr": 0.010966507972178475
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.025767252010855963,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.025767252010855963
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.32727272727272727,
"acc_stderr": 0.044942908662520896,
"acc_norm": 0.32727272727272727,
"acc_norm_stderr": 0.044942908662520896
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.17959183673469387,
"acc_stderr": 0.024573293589585637,
"acc_norm": 0.17959183673469387,
"acc_norm_stderr": 0.024573293589585637
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.21393034825870647,
"acc_stderr": 0.028996909693328927,
"acc_norm": 0.21393034825870647,
"acc_norm_stderr": 0.028996909693328927
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3253012048192771,
"acc_stderr": 0.03647168523683227,
"acc_norm": 0.3253012048192771,
"acc_norm_stderr": 0.03647168523683227
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.031267817146631786,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.031267817146631786
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22643818849449204,
"mc1_stderr": 0.014651337324602574,
"mc2": 0.372644846918848,
"mc2_stderr": 0.014009270688888235
},
"harness|winogrande|5": {
"acc": 0.6479873717442778,
"acc_stderr": 0.013422874824929714
},
"harness|gsm8k|5": {
"acc": 0.028051554207733132,
"acc_stderr": 0.004548229533836337
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Josephgflowers__Tinyllama-1.3B-Cinder-Reason-Test-2 | [
"region:us"
] | 2024-02-04T18:19:35+00:00 | {"pretty_name": "Evaluation run of Josephgflowers/Tinyllama-1.3B-Cinder-Reason-Test-2", "dataset_summary": "Dataset automatically created during the evaluation run of model [Josephgflowers/Tinyllama-1.3B-Cinder-Reason-Test-2](https://huggingface.co/Josephgflowers/Tinyllama-1.3B-Cinder-Reason-Test-2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Josephgflowers__Tinyllama-1.3B-Cinder-Reason-Test-2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-04T20:26:53.463273](https://huggingface.co/datasets/open-llm-leaderboard/details_Josephgflowers__Tinyllama-1.3B-Cinder-Reason-Test-2/blob/main/results_2024-02-04T20-26-53.463273.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2605954222902013,\n \"acc_stderr\": 0.030887287206153434,\n \"acc_norm\": 0.2609822344299048,\n \"acc_norm_stderr\": 0.031636108991043924,\n \"mc1\": 0.22643818849449204,\n \"mc1_stderr\": 0.014651337324602574,\n \"mc2\": 0.372644846918848,\n \"mc2_stderr\": 0.014009270688888235\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.29436860068259385,\n \"acc_stderr\": 0.013318528460539422,\n \"acc_norm\": 0.32764505119453924,\n \"acc_norm_stderr\": 0.013715847940719346\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4347739494124676,\n \"acc_stderr\": 0.004947141797384123,\n \"acc_norm\": 0.5791674965146385,\n \"acc_norm_stderr\": 0.004926837572202166\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.24444444444444444,\n \"acc_stderr\": 0.03712537833614866,\n \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.03712537833614866\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.03317672787533157,\n \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.03317672787533157\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2679245283018868,\n \"acc_stderr\": 0.027257260322494845,\n \"acc_norm\": 0.2679245283018868,\n \"acc_norm_stderr\": 0.027257260322494845\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2152777777777778,\n \"acc_stderr\": 0.03437079344106134,\n \"acc_norm\": 
0.2152777777777778,\n \"acc_norm_stderr\": 0.03437079344106134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n \"acc_stderr\": 0.03242414757483099,\n \"acc_norm\": 0.23699421965317918,\n \"acc_norm_stderr\": 0.03242414757483099\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179961,\n \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179961\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.3191489361702128,\n \"acc_stderr\": 0.030472973363380045,\n \"acc_norm\": 0.3191489361702128,\n \"acc_norm_stderr\": 0.030472973363380045\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n \"acc_stderr\": 0.04096985139843672,\n \"acc_norm\": 0.2543859649122807,\n \"acc_norm_stderr\": 0.04096985139843672\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2689655172413793,\n \"acc_stderr\": 0.03695183311650232,\n \"acc_norm\": 0.2689655172413793,\n \"acc_norm_stderr\": 0.03695183311650232\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.20634920634920634,\n \"acc_stderr\": 0.036196045241242515,\n \"acc_norm\": 0.20634920634920634,\n \"acc_norm_stderr\": 0.036196045241242515\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653695,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653695\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.24838709677419354,\n \"acc_stderr\": 0.024580028921481003,\n \"acc_norm\": 0.24838709677419354,\n \"acc_norm_stderr\": 0.024580028921481003\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2660098522167488,\n \"acc_stderr\": 0.03108982600293753,\n \"acc_norm\": 0.2660098522167488,\n \"acc_norm_stderr\": 0.03108982600293753\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.21717171717171718,\n \"acc_stderr\": 0.029376616484945637,\n \"acc_norm\": 0.21717171717171718,\n \"acc_norm_stderr\": 0.029376616484945637\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.24870466321243523,\n 
\"acc_stderr\": 0.03119584087770031,\n \"acc_norm\": 0.24870466321243523,\n \"acc_norm_stderr\": 0.03119584087770031\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.2846153846153846,\n \"acc_stderr\": 0.022878322799706287,\n \"acc_norm\": 0.2846153846153846,\n \"acc_norm_stderr\": 0.022878322799706287\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24074074074074073,\n \"acc_stderr\": 0.026067159222275805,\n \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.026067159222275805\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.24789915966386555,\n \"acc_stderr\": 0.028047967224176896,\n \"acc_norm\": 0.24789915966386555,\n \"acc_norm_stderr\": 0.028047967224176896\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2052980132450331,\n \"acc_stderr\": 0.03297986648473834,\n \"acc_norm\": 0.2052980132450331,\n \"acc_norm_stderr\": 0.03297986648473834\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.23302752293577983,\n \"acc_stderr\": 0.0181256691808615,\n \"acc_norm\": 0.23302752293577983,\n \"acc_norm_stderr\": 0.0181256691808615\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3425925925925926,\n \"acc_stderr\": 0.03236585252602158,\n \"acc_norm\": 0.3425925925925926,\n \"acc_norm_stderr\": 0.03236585252602158\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2911392405063291,\n \"acc_stderr\": 0.029571601065753374,\n \"acc_norm\": 0.2911392405063291,\n \"acc_norm_stderr\": 0.029571601065753374\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.37668161434977576,\n \"acc_stderr\": 0.032521134899291884,\n \"acc_norm\": 0.37668161434977576,\n \"acc_norm_stderr\": 0.032521134899291884\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.21374045801526717,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.21374045801526717,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.23140495867768596,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.23140495867768596,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.23148148148148148,\n \"acc_stderr\": 0.04077494709252628,\n \"acc_norm\": 0.23148148148148148,\n \"acc_norm_stderr\": 0.04077494709252628\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.3312883435582822,\n \"acc_stderr\": 0.03697983910025588,\n \"acc_norm\": 0.3312883435582822,\n \"acc_norm_stderr\": 0.03697983910025588\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.22321428571428573,\n \"acc_stderr\": 0.039523019677025116,\n \"acc_norm\": 0.22321428571428573,\n \"acc_norm_stderr\": 0.039523019677025116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690877,\n \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690877\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.24358974358974358,\n \"acc_stderr\": 0.02812096650391441,\n \"acc_norm\": 0.24358974358974358,\n \"acc_norm_stderr\": 0.02812096650391441\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n 
\"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2835249042145594,\n \"acc_stderr\": 0.016117318166832283,\n \"acc_norm\": 0.2835249042145594,\n \"acc_norm_stderr\": 0.016117318166832283\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24566473988439305,\n \"acc_stderr\": 0.02317629820399201,\n \"acc_norm\": 0.24566473988439305,\n \"acc_norm_stderr\": 0.02317629820399201\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.24183006535947713,\n \"acc_stderr\": 0.024518195641879334,\n \"acc_norm\": 0.24183006535947713,\n \"acc_norm_stderr\": 0.024518195641879334\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2861736334405145,\n \"acc_stderr\": 0.025670259242188947,\n \"acc_norm\": 0.2861736334405145,\n \"acc_norm_stderr\": 0.025670259242188947\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2654320987654321,\n \"acc_stderr\": 0.024569223600460845,\n \"acc_norm\": 0.2654320987654321,\n \"acc_norm_stderr\": 0.024569223600460845\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23049645390070922,\n \"acc_stderr\": 0.025123739226872395,\n \"acc_norm\": 0.23049645390070922,\n \"acc_norm_stderr\": 0.025123739226872395\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2438070404172099,\n \"acc_stderr\": 0.010966507972178475,\n \"acc_norm\": 0.2438070404172099,\n \"acc_norm_stderr\": 0.010966507972178475\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.025767252010855963,\n \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.025767252010855963\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.32727272727272727,\n \"acc_stderr\": 0.044942908662520896,\n \"acc_norm\": 0.32727272727272727,\n \"acc_norm_stderr\": 0.044942908662520896\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.17959183673469387,\n \"acc_stderr\": 0.024573293589585637,\n \"acc_norm\": 0.17959183673469387,\n \"acc_norm_stderr\": 0.024573293589585637\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.21393034825870647,\n \"acc_stderr\": 0.028996909693328927,\n \"acc_norm\": 0.21393034825870647,\n \"acc_norm_stderr\": 0.028996909693328927\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3253012048192771,\n \"acc_stderr\": 0.03647168523683227,\n \"acc_norm\": 0.3253012048192771,\n \"acc_norm_stderr\": 0.03647168523683227\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.031267817146631786,\n \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.031267817146631786\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22643818849449204,\n \"mc1_stderr\": 0.014651337324602574,\n \"mc2\": 0.372644846918848,\n \"mc2_stderr\": 0.014009270688888235\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6479873717442778,\n \"acc_stderr\": 
0.013422874824929714\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.028051554207733132,\n \"acc_stderr\": 0.004548229533836337\n }\n}\n```", "repo_url": "https://huggingface.co/Josephgflowers/Tinyllama-1.3B-Cinder-Reason-Test-2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|arc:challenge|25_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|arc:challenge|25_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|arc:challenge|25_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|gsm8k|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|gsm8k|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|gsm8k|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hellaswag|10_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hellaswag|10_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|hellaswag|10_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T18-17-11.697806.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T18-17-11.697806.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T18-17-11.697806.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T18-17-11.697806.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T18-17-11.697806.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T18-17-11.697806.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T18-17-11.697806.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T18-17-11.697806.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T18-17-11.697806.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T18-17-11.697806.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T18-17-11.697806.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T18-17-11.697806.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T18-17-11.697806.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T18-17-11.697806.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T18-17-11.697806.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T18-17-11.697806.parquet", 
"**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T18-17-11.697806.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T18-17-11.697806.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T18-17-11.697806.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T18-17-11.697806.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T18-17-11.697806.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T18-17-11.697806.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T18-17-11.697806.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T18-17-11.697806.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T18-17-11.697806.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T18-17-11.697806.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T18-17-11.697806.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T18-17-11.697806.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T18-17-11.697806.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T18-17-11.697806.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T18-17-11.697806.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T18-17-11.697806.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T18-17-11.697806.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T18-17-11.697806.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T18-17-11.697806.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T18-17-11.697806.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T18-17-11.697806.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T18-17-11.697806.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T18-17-11.697806.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T18-17-11.697806.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T18-17-11.697806.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T18-17-11.697806.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T18-17-11.697806.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T18-17-11.697806.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T18-17-11.697806.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T18-17-11.697806.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T18-17-11.697806.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T18-17-11.697806.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T18-17-11.697806.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T18-17-11.697806.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T18-17-11.697806.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T18-17-11.697806.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T18-17-11.697806.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T18-17-11.697806.parquet", 
"**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T18-17-11.697806.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T18-17-11.697806.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T18-52-11.664162.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T18-52-11.664162.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T18-52-11.664162.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T18-52-11.664162.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T18-52-11.664162.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T18-52-11.664162.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T18-52-11.664162.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T18-52-11.664162.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T18-52-11.664162.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T18-52-11.664162.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T18-52-11.664162.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T18-52-11.664162.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T18-52-11.664162.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T18-52-11.664162.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T18-52-11.664162.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T18-52-11.664162.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T18-52-11.664162.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T18-52-11.664162.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T18-52-11.664162.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T18-52-11.664162.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T18-52-11.664162.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T18-52-11.664162.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T18-52-11.664162.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T18-52-11.664162.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T18-52-11.664162.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T18-52-11.664162.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T18-52-11.664162.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T18-52-11.664162.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T18-52-11.664162.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T18-52-11.664162.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T18-52-11.664162.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T18-52-11.664162.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T18-52-11.664162.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T18-52-11.664162.parquet", 
"**/details_harness|hendrycksTest-international_law|5_2024-02-04T18-52-11.664162.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T18-52-11.664162.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T18-52-11.664162.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T18-52-11.664162.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T18-52-11.664162.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T18-52-11.664162.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T18-52-11.664162.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T18-52-11.664162.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T18-52-11.664162.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T18-52-11.664162.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T18-52-11.664162.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T18-52-11.664162.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T18-52-11.664162.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T18-52-11.664162.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T18-52-11.664162.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T18-52-11.664162.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T18-52-11.664162.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T18-52-11.664162.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T18-52-11.664162.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T18-52-11.664162.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T18-52-11.664162.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T18-52-11.664162.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T20-26-53.463273.parquet", 
"**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T20-26-53.463273.parquet", 
"**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T20-26-53.463273.parquet", 
"**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T20-26-53.463273.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": 
["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": 
["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": 
["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": 
["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": 
["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": 
["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": 
["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["**/details_harness|winogrande|5_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["**/details_harness|winogrande|5_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": 
["**/details_harness|winogrande|5_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-04T20-26-53.463273.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_04T18_17_11.697806", "path": ["results_2024-02-04T18-17-11.697806.parquet"]}, {"split": "2024_02_04T18_52_11.664162", "path": ["results_2024-02-04T18-52-11.664162.parquet"]}, {"split": "2024_02_04T20_26_53.463273", "path": ["results_2024-02-04T20-26-53.463273.parquet"]}, {"split": "latest", "path": ["results_2024-02-04T20-26-53.463273.parquet"]}]}]} | 2024-02-04T20:29:17+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Josephgflowers/Tinyllama-1.3B-Cinder-Reason-Test-2
Dataset automatically created during the evaluation run of model Josephgflowers/Tinyllama-1.3B-Cinder-Reason-Test-2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
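A minimal sketch using the `datasets` library; the repository name below is inferred from the model id, following the naming pattern of the other evaluation-run datasets in this collection, and the configuration/split names are taken from this run's configuration list:

```python
from datasets import load_dataset

# Repository name inferred from the model id; "harness_winogrande_5" is one of
# the configurations listed for this run, and the "latest" split points to the
# most recent evaluation.
data = load_dataset(
    "open-llm-leaderboard/details_Josephgflowers__Tinyllama-1.3B-Cinder-Reason-Test-2",
    "harness_winogrande_5",
    split="latest",
)
```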
## Latest results
These are the latest results from run 2024-02-04T20:26:53.463273 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Josephgflowers/Tinyllama-1.3B-Cinder-Reason-Test-2\n\n\n\nDataset automatically created during the evaluation run of model Josephgflowers/Tinyllama-1.3B-Cinder-Reason-Test-2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T20:26:53.463273(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Josephgflowers/Tinyllama-1.3B-Cinder-Reason-Test-2\n\n\n\nDataset automatically created during the evaluation run of model Josephgflowers/Tinyllama-1.3B-Cinder-Reason-Test-2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T20:26:53.463273(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
4b772da0e2cf6fdd856d95c35ca468b4dfa6a800 | ## id2label = {
## 0: 'binary visual question answering',
## 1:'search by image',
## 2:'image search by text',
## 3:'geospatial question answering',
## 4:'count objects in image' ,
## 5:'object extraction in image',
## 6:'image segmentation'
## }
---
size_categories:
- 1K<n<10K
task_categories:
- text-classification
dataset_info:
features:
- name: request
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 99611
num_examples: 1120
- name: validation
num_bytes: 24896
num_examples: 280
download_size: 27907
dataset_size: 124507
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
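A brief usage sketch, assuming the features declared above (`request`: string, `label`: int64), the train/validation splits, and that the repository loads directly by its id (`myrtotsok/clf`); these loading details are assumptions:

```python
from datasets import load_dataset

# Mapping copied from the id2label block above.
id2label = {
    0: "binary visual question answering",
    1: "search by image",
    2: "image search by text",
    3: "geospatial question answering",
    4: "count objects in image",
    5: "object extraction in image",
    6: "image segmentation",
}

# Assumed to load directly from the Hub by repository id.
ds = load_dataset("myrtotsok/clf", split="train")
example = ds[0]
print(example["request"], "->", id2label[example["label"]])
```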
| myrtotsok/clf | [
"region:us"
] | 2024-02-04T18:22:03+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "request", "dtype": "string"}, {"name": "label", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 96731, "num_examples": 1120}, {"name": "validation", "num_bytes": 24176, "num_examples": 280}], "download_size": 27784, "dataset_size": 120907}} | 2024-02-06T08:38:02+00:00 | [] | [] | TAGS
#region-us
| ## id2label = {
## 0: 'binary visual question answering',
## 1:'search by image',
## 2:'image search by text',
## 3:'geospatial question answering',
## 4:'count objects in image' ,
## 5:'object extraction in image',
## 6:'image segmentation'
## }
---
size_categories:
- 1K<n<10K
task_categories:
- text-classification
dataset_info:
features:
- name: request
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 99611
num_examples: 1120
- name: validation
num_bytes: 24896
num_examples: 280
download_size: 27907
dataset_size: 124507
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
| [
"## id2label = {",
"## 0: 'binary visual question answering',",
"## 1:'search by image',",
"## 2:'image search by text',",
"## 3:'geospatial question answering',",
"## 4:'count objects in image' ,",
"## 5:'object extraction in image',",
"## 6:'image segmentation'",
"## }\n\n---\nsize_categories:\n- 1K<n<10K\ntask_categories:\n- text-classification\ndataset_info:\n features:\n - name: request\n dtype: string\n - name: label\n dtype: int64\n splits:\n - name: train\n num_bytes: 99611\n num_examples: 1120\n - name: validation\n num_bytes: 24896\n num_examples: 280\n download_size: 27907\n dataset_size: 124507\nconfigs:\n- config_name: default\n data_files:\n - split: train\n path: data/train-*\n - split: validation\n path: data/validation-*\n---"
] | [
"TAGS\n#region-us \n",
"## id2label = {",
"## 0: 'binary visual question answering',",
"## 1:'search by image',",
"## 2:'image search by text',",
"## 3:'geospatial question answering',",
"## 4:'count objects in image' ,",
"## 5:'object extraction in image',",
"## 6:'image segmentation'",
"## }\n\n---\nsize_categories:\n- 1K<n<10K\ntask_categories:\n- text-classification\ndataset_info:\n features:\n - name: request\n dtype: string\n - name: label\n dtype: int64\n splits:\n - name: train\n num_bytes: 99611\n num_examples: 1120\n - name: validation\n num_bytes: 24896\n num_examples: 280\n download_size: 27907\n dataset_size: 124507\nconfigs:\n- config_name: default\n data_files:\n - split: train\n path: data/train-*\n - split: validation\n path: data/validation-*\n---"
] |
c1d16902c8509c1f5aff8ba79d4dd1371e1d7d09 |
# ArabicaQA
ArabicaQA: Comprehensive Dataset for Arabic Question Answering
This repository contains the dataset for the paper *ArabicaQA: Comprehensive Dataset for Arabic Question Answering*. Below, we provide details regarding the materials available in this repository:
## Dataset
Within this folder, you will find the training, validation, and test sets of the ArabicaQA dataset. Refer to the table below for the dataset statistics:
| | Training | Validation | Test |
| -------------------|----------|------------|--------|
| MRC (with answers) | 62,186 | 13,483 | 13,426 |
| MRC (unanswerable) | 2,596 | 561 | 544 |
| Open-Domain | 62,057 | 13,475 | 13,414 |
| Open-Domain | 58,528 | 12,541 | 12,541 |
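A minimal loading sketch, assuming the dataset is published on the Hugging Face Hub under `abdoelsayed/ArabicaQA` and that the splits follow the train/validation/test layout described above; the exact file organisation may differ:

```python
from datasets import load_dataset

# Assumed repository id and split layout; adjust to the actual file structure.
arabica = load_dataset("abdoelsayed/ArabicaQA")
print(arabica)  # expected splits: train / validation / test
```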
| abdoelsayed/ArabicaQA | [
"task_categories:question-answering",
"annotations_creators:crowdsourced",
"language_creators:crowdsourced",
"language_creators:found",
"size_categories:10K<n<100K",
"language:ar",
"license:mit",
"region:us"
] | 2024-02-04T18:37:18+00:00 | {"annotations_creators": ["crowdsourced"], "language_creators": ["crowdsourced", "found"], "language": ["ar"], "license": "mit", "size_categories": ["10K<n<100K"], "task_categories": ["question-answering"], "pretty_name": "ArabicaQA"} | 2024-02-04T23:00:10+00:00 | [] | [
"ar"
] | TAGS
#task_categories-question-answering #annotations_creators-crowdsourced #language_creators-crowdsourced #language_creators-found #size_categories-10K<n<100K #language-Arabic #license-mit #region-us
| ArabicaQA
=========
ArabicaQA: Comprehensive Dataset for Arabic Question Answering
This repository contains dataset for paper *ArabicaQA: Comprehensive Dataset for Arabic Question Answering*. Below, we provide details regarding the materials available in this repository:
Dataset
-------
Within this folder, you will find the training, validation, and test sets of the ArabicaQA dataset. Refer to the table below for the dataset statistics:
| [] | [
"TAGS\n#task_categories-question-answering #annotations_creators-crowdsourced #language_creators-crowdsourced #language_creators-found #size_categories-10K<n<100K #language-Arabic #license-mit #region-us \n"
] |
1a6b4497478925187b5458fda6a4c0605d696cd3 |
# Dataset Card for Evaluation run of vikash06/doctorMistralLLM10k
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [vikash06/doctorMistralLLM10k](https://huggingface.co/vikash06/doctorMistralLLM10k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_vikash06__doctorMistralLLM10k",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-05T06:54:54.227795](https://huggingface.co/datasets/open-llm-leaderboard/details_vikash06__doctorMistralLLM10k/blob/main/results_2024-02-05T06-54-54.227795.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2580661918322065,
"acc_stderr": 0.03078651762487308,
"acc_norm": 0.25995106169689164,
"acc_norm_stderr": 0.031613975800093994,
"mc1": 0.2717258261933905,
"mc1_stderr": 0.015572840452875828,
"mc2": 0.4827927075620741,
"mc2_stderr": 0.016686387374993165
},
"harness|arc:challenge|25": {
"acc": 0.20477815699658702,
"acc_stderr": 0.011792544338513405,
"acc_norm": 0.2721843003412969,
"acc_norm_stderr": 0.013006600406423709
},
"harness|hellaswag|10": {
"acc": 0.2591117307309301,
"acc_stderr": 0.00437251606016475,
"acc_norm": 0.27454690300736906,
"acc_norm_stderr": 0.00445373590094783
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.04094376269996793,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.04094376269996793
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3355263157894737,
"acc_stderr": 0.03842498559395268,
"acc_norm": 0.3355263157894737,
"acc_norm_stderr": 0.03842498559395268
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2188679245283019,
"acc_stderr": 0.02544786382510861,
"acc_norm": 0.2188679245283019,
"acc_norm_stderr": 0.02544786382510861
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2138728323699422,
"acc_stderr": 0.03126511206173044,
"acc_norm": 0.2138728323699422,
"acc_norm_stderr": 0.03126511206173044
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.22127659574468084,
"acc_stderr": 0.02713634960242406,
"acc_norm": 0.22127659574468084,
"acc_norm_stderr": 0.02713634960242406
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748141,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748141
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2,
"acc_stderr": 0.033333333333333284,
"acc_norm": 0.2,
"acc_norm_stderr": 0.033333333333333284
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.14285714285714285,
"acc_stderr": 0.0312984318574381,
"acc_norm": 0.14285714285714285,
"acc_norm_stderr": 0.0312984318574381
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3161290322580645,
"acc_stderr": 0.02645087448904277,
"acc_norm": 0.3161290322580645,
"acc_norm_stderr": 0.02645087448904277
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.30049261083743845,
"acc_stderr": 0.032257994762334846,
"acc_norm": 0.30049261083743845,
"acc_norm_stderr": 0.032257994762334846
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.28484848484848485,
"acc_stderr": 0.035243908445117836,
"acc_norm": 0.28484848484848485,
"acc_norm_stderr": 0.035243908445117836
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.24242424242424243,
"acc_stderr": 0.03053289223393202,
"acc_norm": 0.24242424242424243,
"acc_norm_stderr": 0.03053289223393202
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.21761658031088082,
"acc_stderr": 0.029778663037752964,
"acc_norm": 0.21761658031088082,
"acc_norm_stderr": 0.029778663037752964
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.022421273612923707,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.022421273612923707
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.027080372815145668,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.027080372815145668
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3487394957983193,
"acc_stderr": 0.03095663632856655,
"acc_norm": 0.3487394957983193,
"acc_norm_stderr": 0.03095663632856655
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22935779816513763,
"acc_stderr": 0.018025349724618684,
"acc_norm": 0.22935779816513763,
"acc_norm_stderr": 0.018025349724618684
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.02933116229425172,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.02933116229425172
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293433,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293433
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3452914798206278,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.3452914798206278,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.25190839694656486,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.25190839694656486,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.038968789850704164,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.038968789850704164
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2392638036809816,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.2392638036809816,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.23214285714285715,
"acc_stderr": 0.040073418097558065,
"acc_norm": 0.23214285714285715,
"acc_norm_stderr": 0.040073418097558065
},
"harness|hendrycksTest-management|5": {
"acc": 0.1650485436893204,
"acc_stderr": 0.036756688322331886,
"acc_norm": 0.1650485436893204,
"acc_norm_stderr": 0.036756688322331886
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.19658119658119658,
"acc_stderr": 0.02603538609895129,
"acc_norm": 0.19658119658119658,
"acc_norm_stderr": 0.02603538609895129
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2707535121328225,
"acc_stderr": 0.015889888362560486,
"acc_norm": 0.2707535121328225,
"acc_norm_stderr": 0.015889888362560486
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.22254335260115607,
"acc_stderr": 0.02239421566194282,
"acc_norm": 0.22254335260115607,
"acc_norm_stderr": 0.02239421566194282
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27262569832402234,
"acc_stderr": 0.014893391735249588,
"acc_norm": 0.27262569832402234,
"acc_norm_stderr": 0.014893391735249588
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.02495418432487991,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.02495418432487991
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2861736334405145,
"acc_stderr": 0.02567025924218896,
"acc_norm": 0.2861736334405145,
"acc_norm_stderr": 0.02567025924218896
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2932098765432099,
"acc_stderr": 0.02532988817190092,
"acc_norm": 0.2932098765432099,
"acc_norm_stderr": 0.02532988817190092
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.02601199293090201,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.02601199293090201
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2379400260756193,
"acc_stderr": 0.010875700787694245,
"acc_norm": 0.2379400260756193,
"acc_norm_stderr": 0.010875700787694245
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.39705882352941174,
"acc_stderr": 0.02972215209928006,
"acc_norm": 0.39705882352941174,
"acc_norm_stderr": 0.02972215209928006
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2173202614379085,
"acc_stderr": 0.01668482092914861,
"acc_norm": 0.2173202614379085,
"acc_norm_stderr": 0.01668482092914861
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2545454545454545,
"acc_stderr": 0.04172343038705383,
"acc_norm": 0.2545454545454545,
"acc_norm_stderr": 0.04172343038705383
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.23673469387755103,
"acc_stderr": 0.027212835884073146,
"acc_norm": 0.23673469387755103,
"acc_norm_stderr": 0.027212835884073146
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23383084577114427,
"acc_stderr": 0.02992941540834839,
"acc_norm": 0.23383084577114427,
"acc_norm_stderr": 0.02992941540834839
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-virology|5": {
"acc": 0.35542168674698793,
"acc_stderr": 0.03726214354322415,
"acc_norm": 0.35542168674698793,
"acc_norm_stderr": 0.03726214354322415
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.0312678171466318,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.0312678171466318
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2717258261933905,
"mc1_stderr": 0.015572840452875828,
"mc2": 0.4827927075620741,
"mc2_stderr": 0.016686387374993165
},
"harness|winogrande|5": {
"acc": 0.4877663772691397,
"acc_stderr": 0.01404827882040562
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
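For quick inspection, the aggregated numbers can also be read straight from the results file linked above; a small sketch, assuming a local copy that contains exactly the dictionary printed here (the file name and local path are assumptions):

```python
import json

# Assumed local copy of the linked results file.
with open("results_2024-02-05T06-54-54.227795.json") as f:
    results = json.load(f)

print(results["all"]["acc"])       # mean accuracy across all tasks
print(results["harness|gsm8k|5"])  # per-task entries use harness|<task>|<shots> keys
```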
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_vikash06__doctorMistralLLM10k | [
"region:us"
] | 2024-02-04T19:04:00+00:00 | {"pretty_name": "Evaluation run of vikash06/doctorMistralLLM10k", "dataset_summary": "Dataset automatically created during the evaluation run of model [vikash06/doctorMistralLLM10k](https://huggingface.co/vikash06/doctorMistralLLM10k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_vikash06__doctorMistralLLM10k\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-05T06:54:54.227795](https://huggingface.co/datasets/open-llm-leaderboard/details_vikash06__doctorMistralLLM10k/blob/main/results_2024-02-05T06-54-54.227795.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2580661918322065,\n \"acc_stderr\": 0.03078651762487308,\n \"acc_norm\": 0.25995106169689164,\n \"acc_norm_stderr\": 0.031613975800093994,\n \"mc1\": 0.2717258261933905,\n \"mc1_stderr\": 0.015572840452875828,\n \"mc2\": 0.4827927075620741,\n \"mc2_stderr\": 0.016686387374993165\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.20477815699658702,\n \"acc_stderr\": 0.011792544338513405,\n \"acc_norm\": 0.2721843003412969,\n \"acc_norm_stderr\": 0.013006600406423709\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2591117307309301,\n \"acc_stderr\": 0.00437251606016475,\n \"acc_norm\": 0.27454690300736906,\n \"acc_norm_stderr\": 0.00445373590094783\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.04094376269996793,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.04094376269996793\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.3355263157894737,\n \"acc_stderr\": 0.03842498559395268,\n \"acc_norm\": 0.3355263157894737,\n \"acc_norm_stderr\": 0.03842498559395268\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2188679245283019,\n \"acc_stderr\": 0.02544786382510861,\n \"acc_norm\": 0.2188679245283019,\n \"acc_norm_stderr\": 0.02544786382510861\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2138728323699422,\n \"acc_stderr\": 0.03126511206173044,\n \"acc_norm\": 0.2138728323699422,\n \"acc_norm_stderr\": 0.03126511206173044\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.22127659574468084,\n \"acc_stderr\": 0.02713634960242406,\n \"acc_norm\": 0.22127659574468084,\n \"acc_norm_stderr\": 0.02713634960242406\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n \"acc_stderr\": 0.04049339297748141,\n \"acc_norm\": 0.24561403508771928,\n \"acc_norm_stderr\": 0.04049339297748141\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.033333333333333284,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.033333333333333284\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.14285714285714285,\n \"acc_stderr\": 0.0312984318574381,\n \"acc_norm\": 0.14285714285714285,\n \"acc_norm_stderr\": 0.0312984318574381\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3161290322580645,\n \"acc_stderr\": 0.02645087448904277,\n \"acc_norm\": 0.3161290322580645,\n \"acc_norm_stderr\": 0.02645087448904277\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.30049261083743845,\n \"acc_stderr\": 0.032257994762334846,\n \"acc_norm\": 0.30049261083743845,\n \"acc_norm_stderr\": 0.032257994762334846\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.28484848484848485,\n \"acc_stderr\": 0.035243908445117836,\n \"acc_norm\": 0.28484848484848485,\n \"acc_norm_stderr\": 0.035243908445117836\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.24242424242424243,\n \"acc_stderr\": 0.03053289223393202,\n \"acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.03053289223393202\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.21761658031088082,\n \"acc_stderr\": 0.029778663037752964,\n \"acc_norm\": 0.21761658031088082,\n \"acc_norm_stderr\": 
0.029778663037752964\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.022421273612923707,\n \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.022421273612923707\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.27037037037037037,\n \"acc_stderr\": 0.027080372815145668,\n \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.027080372815145668\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.3487394957983193,\n \"acc_stderr\": 0.03095663632856655,\n \"acc_norm\": 0.3487394957983193,\n \"acc_norm_stderr\": 0.03095663632856655\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.22935779816513763,\n \"acc_stderr\": 0.018025349724618684,\n \"acc_norm\": 0.22935779816513763,\n \"acc_norm_stderr\": 0.018025349724618684\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.02933116229425172,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.02933116229425172\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293433,\n \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293433\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3452914798206278,\n \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.3452914798206278,\n \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.25190839694656486,\n \"acc_stderr\": 0.03807387116306086,\n \"acc_norm\": 0.25190839694656486,\n \"acc_norm_stderr\": 0.03807387116306086\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.038968789850704164,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.038968789850704164\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.21296296296296297,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.21296296296296297,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2392638036809816,\n \"acc_stderr\": 0.033519538795212696,\n \"acc_norm\": 0.2392638036809816,\n \"acc_norm_stderr\": 0.033519538795212696\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.23214285714285715,\n \"acc_stderr\": 0.040073418097558065,\n \"acc_norm\": 0.23214285714285715,\n \"acc_norm_stderr\": 0.040073418097558065\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.1650485436893204,\n \"acc_stderr\": 0.036756688322331886,\n \"acc_norm\": 0.1650485436893204,\n \"acc_norm_stderr\": 0.036756688322331886\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.19658119658119658,\n \"acc_stderr\": 0.02603538609895129,\n \"acc_norm\": 0.19658119658119658,\n \"acc_norm_stderr\": 0.02603538609895129\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2707535121328225,\n \"acc_stderr\": 0.015889888362560486,\n \"acc_norm\": 0.2707535121328225,\n \"acc_norm_stderr\": 0.015889888362560486\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.22254335260115607,\n \"acc_stderr\": 0.02239421566194282,\n \"acc_norm\": 0.22254335260115607,\n \"acc_norm_stderr\": 0.02239421566194282\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27262569832402234,\n \"acc_stderr\": 0.014893391735249588,\n \"acc_norm\": 0.27262569832402234,\n \"acc_norm_stderr\": 0.014893391735249588\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.02495418432487991,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.02495418432487991\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2861736334405145,\n \"acc_stderr\": 0.02567025924218896,\n \"acc_norm\": 0.2861736334405145,\n \"acc_norm_stderr\": 0.02567025924218896\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2932098765432099,\n \"acc_stderr\": 0.02532988817190092,\n \"acc_norm\": 0.2932098765432099,\n \"acc_norm_stderr\": 0.02532988817190092\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2553191489361702,\n \"acc_stderr\": 0.02601199293090201,\n \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.02601199293090201\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2379400260756193,\n \"acc_stderr\": 0.010875700787694245,\n \"acc_norm\": 0.2379400260756193,\n \"acc_norm_stderr\": 0.010875700787694245\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.39705882352941174,\n \"acc_stderr\": 0.02972215209928006,\n \"acc_norm\": 0.39705882352941174,\n \"acc_norm_stderr\": 0.02972215209928006\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2173202614379085,\n \"acc_stderr\": 0.01668482092914861,\n \"acc_norm\": 0.2173202614379085,\n \"acc_norm_stderr\": 0.01668482092914861\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2545454545454545,\n \"acc_stderr\": 0.04172343038705383,\n \"acc_norm\": 0.2545454545454545,\n \"acc_norm_stderr\": 0.04172343038705383\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.23673469387755103,\n \"acc_stderr\": 0.027212835884073146,\n \"acc_norm\": 0.23673469387755103,\n \"acc_norm_stderr\": 0.027212835884073146\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23383084577114427,\n \"acc_stderr\": 0.02992941540834839,\n \"acc_norm\": 0.23383084577114427,\n \"acc_norm_stderr\": 0.02992941540834839\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.35542168674698793,\n \"acc_stderr\": 0.03726214354322415,\n \"acc_norm\": 0.35542168674698793,\n \"acc_norm_stderr\": 0.03726214354322415\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.0312678171466318,\n \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.0312678171466318\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2717258261933905,\n \"mc1_stderr\": 0.015572840452875828,\n \"mc2\": 0.4827927075620741,\n \"mc2_stderr\": 0.016686387374993165\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.4877663772691397,\n \"acc_stderr\": 0.01404827882040562\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/vikash06/doctorMistralLLM10k", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|arc:challenge|25_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|arc:challenge|25_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|arc:challenge|25_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|gsm8k|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|gsm8k|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|gsm8k|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hellaswag|10_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hellaswag|10_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hellaswag|10_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T19-01-38.586623.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T19-01-38.586623.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T19-01-38.586623.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T19-01-38.586623.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T19-01-38.586623.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T19-01-38.586623.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T19-01-38.586623.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T19-01-38.586623.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T19-01-38.586623.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T19-01-38.586623.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T19-01-38.586623.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T19-01-38.586623.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T19-01-38.586623.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T19-01-38.586623.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T19-01-38.586623.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T19-01-38.586623.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T19-01-38.586623.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T19-01-38.586623.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T19-01-38.586623.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T19-01-38.586623.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T19-01-38.586623.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T19-01-38.586623.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T19-01-38.586623.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T19-01-38.586623.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T19-01-38.586623.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T19-01-38.586623.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T19-01-38.586623.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T19-01-38.586623.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T19-01-38.586623.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T19-01-38.586623.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T19-01-38.586623.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T19-01-38.586623.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T19-01-38.586623.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T19-01-38.586623.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T19-01-38.586623.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T19-01-38.586623.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T19-01-38.586623.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T19-01-38.586623.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T19-01-38.586623.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T19-01-38.586623.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T19-01-38.586623.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T19-01-38.586623.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T19-01-38.586623.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T19-01-38.586623.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T19-01-38.586623.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T19-01-38.586623.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T19-01-38.586623.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T19-01-38.586623.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T19-01-38.586623.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T19-01-38.586623.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T19-01-38.586623.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T19-01-38.586623.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T19-01-38.586623.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T19-01-38.586623.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T19-01-38.586623.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T19-01-38.586623.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T04-20-20.106665.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-05T04-20-20.106665.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-05T04-20-20.106665.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T04-20-20.106665.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T04-20-20.106665.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-05T04-20-20.106665.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T04-20-20.106665.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T04-20-20.106665.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T04-20-20.106665.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T04-20-20.106665.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-05T04-20-20.106665.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-05T04-20-20.106665.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T04-20-20.106665.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-05T04-20-20.106665.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T04-20-20.106665.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T04-20-20.106665.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T04-20-20.106665.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-05T04-20-20.106665.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T04-20-20.106665.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T04-20-20.106665.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T04-20-20.106665.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T04-20-20.106665.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T04-20-20.106665.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T04-20-20.106665.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T04-20-20.106665.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T04-20-20.106665.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T04-20-20.106665.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T04-20-20.106665.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T04-20-20.106665.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T04-20-20.106665.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T04-20-20.106665.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T04-20-20.106665.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-05T04-20-20.106665.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T04-20-20.106665.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-05T04-20-20.106665.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T04-20-20.106665.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T04-20-20.106665.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T04-20-20.106665.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-05T04-20-20.106665.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-05T04-20-20.106665.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T04-20-20.106665.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T04-20-20.106665.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T04-20-20.106665.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T04-20-20.106665.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-05T04-20-20.106665.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-05T04-20-20.106665.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-05T04-20-20.106665.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T04-20-20.106665.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-05T04-20-20.106665.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T04-20-20.106665.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T04-20-20.106665.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-05T04-20-20.106665.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-05T04-20-20.106665.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-05T04-20-20.106665.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T04-20-20.106665.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-05T04-20-20.106665.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-05T06-54-54.227795.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-05T06-54-54.227795.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T06-54-54.227795.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-05T06-54-54.227795.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-05T06-54-54.227795.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": 
"2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": 
["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", 
"data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": 
["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": 
["**/details_harness|hendrycksTest-sociology|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["**/details_harness|winogrande|5_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": ["**/details_harness|winogrande|5_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["**/details_harness|winogrande|5_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-05T06-54-54.227795.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_04T19_01_38.586623", "path": ["results_2024-02-04T19-01-38.586623.parquet"]}, {"split": "2024_02_05T04_20_20.106665", "path": 
["results_2024-02-05T04-20-20.106665.parquet"]}, {"split": "2024_02_05T06_54_54.227795", "path": ["results_2024-02-05T06-54-54.227795.parquet"]}, {"split": "latest", "path": ["results_2024-02-05T06-54-54.227795.parquet"]}]}]} | 2024-02-05T06:57:25+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of vikash06/doctorMistralLLM10k
Dataset automatically created during the evaluation run of model vikash06/doctorMistralLLM10k on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-02-05T06:54:54.227795 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of vikash06/doctorMistralLLM10k\n\n\n\nDataset automatically created during the evaluation run of model vikash06/doctorMistralLLM10k on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-05T06:54:54.227795(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of vikash06/doctorMistralLLM10k\n\n\n\nDataset automatically created during the evaluation run of model vikash06/doctorMistralLLM10k on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-05T06:54:54.227795(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
cc555dec0e4bc525c10a504ac5c2971a847e1fc5 |
# Dataset Card for Evaluation run of EleutherAI/llemma_7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [EleutherAI/llemma_7b](https://huggingface.co/EleutherAI/llemma_7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_EleutherAI__llemma_7b",
"harness_winogrande_5",
split="train")
```
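
Beyond a single task, you can also list the available configurations and load the aggregated "results" configuration. The sketch below is illustrative rather than part of the generated card: it assumes the `datasets` and `pandas` packages are installed, reuses the repository and configuration names documented here, and uses the "latest" split name that appears in this repository's split listing.

```python
from datasets import get_dataset_config_names, load_dataset

# List every configuration in the repository (one per evaluated task, plus "results").
configs = get_dataset_config_names("open-llm-leaderboard/details_EleutherAI__llemma_7b")
print(len(configs), "configurations available")

# Load the aggregated "results" configuration; the "latest" split points to the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_EleutherAI__llemma_7b",
    "results",
    split="latest",
)

# Inspect the aggregated metrics as a pandas DataFrame.
print(results.to_pandas().head())
```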
## Latest results
These are the [latest results from run 2024-02-04T21:06:01.286568](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__llemma_7b/blob/main/results_2024-02-04T21-06-01.286568.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4779475755535138,
"acc_stderr": 0.034922008654048396,
"acc_norm": 0.4810020105261036,
"acc_norm_stderr": 0.0356546932543351,
"mc1": 0.2582619339045288,
"mc1_stderr": 0.015321821688476199,
"mc2": 0.3887901119268913,
"mc2_stderr": 0.014502145592953165
},
"harness|arc:challenge|25": {
"acc": 0.43856655290102387,
"acc_stderr": 0.014500682618212864,
"acc_norm": 0.4616040955631399,
"acc_norm_stderr": 0.014568245550296356
},
"harness|hellaswag|10": {
"acc": 0.46265684126667994,
"acc_stderr": 0.00497584533508662,
"acc_norm": 0.6297550288787094,
"acc_norm_stderr": 0.004818833521340353
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.04244633238353228,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.04244633238353228
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5789473684210527,
"acc_stderr": 0.04017901275981748,
"acc_norm": 0.5789473684210527,
"acc_norm_stderr": 0.04017901275981748
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4867924528301887,
"acc_stderr": 0.030762134874500476,
"acc_norm": 0.4867924528301887,
"acc_norm_stderr": 0.030762134874500476
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.04174752578923185,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.04174752578923185
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.44,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.44508670520231214,
"acc_stderr": 0.03789401760283648,
"acc_norm": 0.44508670520231214,
"acc_norm_stderr": 0.03789401760283648
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383888,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383888
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709390974,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709390974
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.38596491228070173,
"acc_stderr": 0.04579639422070434,
"acc_norm": 0.38596491228070173,
"acc_norm_stderr": 0.04579639422070434
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.025542846817400506,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.025542846817400506
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.04343525428949097,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.04343525428949097
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5225806451612903,
"acc_stderr": 0.02841498501970786,
"acc_norm": 0.5225806451612903,
"acc_norm_stderr": 0.02841498501970786
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.43842364532019706,
"acc_stderr": 0.03491207857486518,
"acc_norm": 0.43842364532019706,
"acc_norm_stderr": 0.03491207857486518
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5151515151515151,
"acc_stderr": 0.03902551007374448,
"acc_norm": 0.5151515151515151,
"acc_norm_stderr": 0.03902551007374448
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5656565656565656,
"acc_stderr": 0.03531505879359184,
"acc_norm": 0.5656565656565656,
"acc_norm_stderr": 0.03531505879359184
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5284974093264249,
"acc_stderr": 0.036025735712884414,
"acc_norm": 0.5284974093264249,
"acc_norm_stderr": 0.036025735712884414
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4641025641025641,
"acc_stderr": 0.025285585990017845,
"acc_norm": 0.4641025641025641,
"acc_norm_stderr": 0.025285585990017845
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.028317533496066468,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.028317533496066468
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5126050420168067,
"acc_stderr": 0.032468167657521745,
"acc_norm": 0.5126050420168067,
"acc_norm_stderr": 0.032468167657521745
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6238532110091743,
"acc_stderr": 0.02076923196820508,
"acc_norm": 0.6238532110091743,
"acc_norm_stderr": 0.02076923196820508
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.03388857118502325,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.03388857118502325
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5,
"acc_stderr": 0.03509312031717982,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03509312031717982
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5864978902953587,
"acc_stderr": 0.03205649904851859,
"acc_norm": 0.5864978902953587,
"acc_norm_stderr": 0.03205649904851859
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.40358744394618834,
"acc_stderr": 0.032928028193303135,
"acc_norm": 0.40358744394618834,
"acc_norm_stderr": 0.032928028193303135
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.48854961832061067,
"acc_stderr": 0.043841400240780176,
"acc_norm": 0.48854961832061067,
"acc_norm_stderr": 0.043841400240780176
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.628099173553719,
"acc_stderr": 0.044120158066245044,
"acc_norm": 0.628099173553719,
"acc_norm_stderr": 0.044120158066245044
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.04830366024635331,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.04830366024635331
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.558282208588957,
"acc_stderr": 0.03901591825836184,
"acc_norm": 0.558282208588957,
"acc_norm_stderr": 0.03901591825836184
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.04521829902833585,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.04521829902833585
},
"harness|hendrycksTest-management|5": {
"acc": 0.6116504854368932,
"acc_stderr": 0.04825729337356389,
"acc_norm": 0.6116504854368932,
"acc_norm_stderr": 0.04825729337356389
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6709401709401709,
"acc_stderr": 0.030782321577688173,
"acc_norm": 0.6709401709401709,
"acc_norm_stderr": 0.030782321577688173
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5568326947637292,
"acc_stderr": 0.0177640850353484,
"acc_norm": 0.5568326947637292,
"acc_norm_stderr": 0.0177640850353484
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5317919075144508,
"acc_stderr": 0.02686462436675665,
"acc_norm": 0.5317919075144508,
"acc_norm_stderr": 0.02686462436675665
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.311731843575419,
"acc_stderr": 0.015491756531894638,
"acc_norm": 0.311731843575419,
"acc_norm_stderr": 0.015491756531894638
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.028580341065138296,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.028580341065138296
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5016077170418006,
"acc_stderr": 0.02839794490780661,
"acc_norm": 0.5016077170418006,
"acc_norm_stderr": 0.02839794490780661
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.02764847787741332,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.02764847787741332
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.32269503546099293,
"acc_stderr": 0.027889139300534802,
"acc_norm": 0.32269503546099293,
"acc_norm_stderr": 0.027889139300534802
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.36114732724902215,
"acc_stderr": 0.012267935477519039,
"acc_norm": 0.36114732724902215,
"acc_norm_stderr": 0.012267935477519039
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.029896163033125464,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.029896163033125464
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.42320261437908496,
"acc_stderr": 0.01998780976948206,
"acc_norm": 0.42320261437908496,
"acc_norm_stderr": 0.01998780976948206
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5,
"acc_stderr": 0.04789131426105757,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04789131426105757
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5673469387755102,
"acc_stderr": 0.031717528240626645,
"acc_norm": 0.5673469387755102,
"acc_norm_stderr": 0.031717528240626645
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6716417910447762,
"acc_stderr": 0.03320685889744324,
"acc_norm": 0.6716417910447762,
"acc_norm_stderr": 0.03320685889744324
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-virology|5": {
"acc": 0.39759036144578314,
"acc_stderr": 0.038099730845402184,
"acc_norm": 0.39759036144578314,
"acc_norm_stderr": 0.038099730845402184
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.03834234744164993,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.03834234744164993
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2582619339045288,
"mc1_stderr": 0.015321821688476199,
"mc2": 0.3887901119268913,
"mc2_stderr": 0.014502145592953165
},
"harness|winogrande|5": {
"acc": 0.632991318074191,
"acc_stderr": 0.013546284512919645
},
"harness|gsm8k|5": {
"acc": 0.332827899924185,
"acc_stderr": 0.012979892496598281
}
}
```
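
As a quick way to work with these numbers programmatically, the following sketch recomputes the average accuracy over the MMLU (hendrycksTest) tasks from the JSON above. The file name `results.json` is illustrative (it simply assumes you saved the block above to disk), and this unweighted per-task mean excludes the non-MMLU tasks, so it will not match the "all" aggregate exactly.

```python
import json

# Path is illustrative: save the JSON block above to this file first.
with open("results.json") as f:
    results = json.load(f)

# Collect the accuracy of every MMLU (hendrycksTest) task and average it.
mmlu_accs = [
    metrics["acc"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
]
print(f"{len(mmlu_accs)} MMLU tasks, mean acc = {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```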
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_EleutherAI__llemma_7b | [
"region:us"
] | 2024-02-04T19:10:00+00:00 | {"pretty_name": "Evaluation run of EleutherAI/llemma_7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [EleutherAI/llemma_7b](https://huggingface.co/EleutherAI/llemma_7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_EleutherAI__llemma_7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-04T21:06:01.286568](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__llemma_7b/blob/main/results_2024-02-04T21-06-01.286568.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4779475755535138,\n \"acc_stderr\": 0.034922008654048396,\n \"acc_norm\": 0.4810020105261036,\n \"acc_norm_stderr\": 0.0356546932543351,\n \"mc1\": 0.2582619339045288,\n \"mc1_stderr\": 0.015321821688476199,\n \"mc2\": 0.3887901119268913,\n \"mc2_stderr\": 0.014502145592953165\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.43856655290102387,\n \"acc_stderr\": 0.014500682618212864,\n \"acc_norm\": 0.4616040955631399,\n \"acc_norm_stderr\": 0.014568245550296356\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.46265684126667994,\n \"acc_stderr\": 0.00497584533508662,\n \"acc_norm\": 0.6297550288787094,\n \"acc_norm_stderr\": 0.004818833521340353\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5789473684210527,\n \"acc_stderr\": 0.04017901275981748,\n \"acc_norm\": 0.5789473684210527,\n \"acc_norm_stderr\": 0.04017901275981748\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.4867924528301887,\n \"acc_stderr\": 0.030762134874500476,\n \"acc_norm\": 0.4867924528301887,\n \"acc_norm_stderr\": 0.030762134874500476\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.04174752578923185,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.04174752578923185\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 
0.049888765156985884,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.44508670520231214,\n \"acc_stderr\": 0.03789401760283648,\n \"acc_norm\": 0.44508670520231214,\n \"acc_norm_stderr\": 0.03789401760283648\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383888,\n \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383888\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.048523658709390974,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.048523658709390974\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.44680851063829785,\n \"acc_stderr\": 0.0325005368436584,\n \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.0325005368436584\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.38596491228070173,\n \"acc_stderr\": 0.04579639422070434,\n \"acc_norm\": 0.38596491228070173,\n \"acc_norm_stderr\": 0.04579639422070434\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.025542846817400506,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.025542846817400506\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n \"acc_stderr\": 0.04343525428949097,\n \"acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.04343525428949097\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5225806451612903,\n \"acc_stderr\": 0.02841498501970786,\n \"acc_norm\": 0.5225806451612903,\n \"acc_norm_stderr\": 0.02841498501970786\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.43842364532019706,\n \"acc_stderr\": 0.03491207857486518,\n \"acc_norm\": 0.43842364532019706,\n \"acc_norm_stderr\": 0.03491207857486518\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.5151515151515151,\n \"acc_stderr\": 0.03902551007374448,\n \"acc_norm\": 0.5151515151515151,\n \"acc_norm_stderr\": 0.03902551007374448\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5656565656565656,\n \"acc_stderr\": 0.03531505879359184,\n \"acc_norm\": 0.5656565656565656,\n \"acc_norm_stderr\": 0.03531505879359184\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.5284974093264249,\n \"acc_stderr\": 0.036025735712884414,\n \"acc_norm\": 0.5284974093264249,\n \"acc_norm_stderr\": 0.036025735712884414\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.4641025641025641,\n \"acc_stderr\": 0.025285585990017845,\n \"acc_norm\": 0.4641025641025641,\n \"acc_norm_stderr\": 0.025285585990017845\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066468,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066468\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5126050420168067,\n \"acc_stderr\": 0.032468167657521745,\n \"acc_norm\": 0.5126050420168067,\n \"acc_norm_stderr\": 0.032468167657521745\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6238532110091743,\n \"acc_stderr\": 0.02076923196820508,\n \"acc_norm\": 0.6238532110091743,\n \"acc_norm_stderr\": 0.02076923196820508\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.03388857118502325,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03388857118502325\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.03509312031717982,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.03509312031717982\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.5864978902953587,\n \"acc_stderr\": 0.03205649904851859,\n \"acc_norm\": 0.5864978902953587,\n \"acc_norm_stderr\": 0.03205649904851859\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.40358744394618834,\n \"acc_stderr\": 0.032928028193303135,\n \"acc_norm\": 0.40358744394618834,\n \"acc_norm_stderr\": 0.032928028193303135\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.48854961832061067,\n \"acc_stderr\": 0.043841400240780176,\n \"acc_norm\": 0.48854961832061067,\n \"acc_norm_stderr\": 0.043841400240780176\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.628099173553719,\n \"acc_stderr\": 0.044120158066245044,\n \"acc_norm\": 0.628099173553719,\n \"acc_norm_stderr\": 0.044120158066245044\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.04830366024635331,\n \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.04830366024635331\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.558282208588957,\n \"acc_stderr\": 0.03901591825836184,\n \"acc_norm\": 0.558282208588957,\n \"acc_norm_stderr\": 0.03901591825836184\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n \"acc_stderr\": 0.04521829902833585,\n \"acc_norm\": 0.3482142857142857,\n \"acc_norm_stderr\": 0.04521829902833585\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6116504854368932,\n \"acc_stderr\": 0.04825729337356389,\n \"acc_norm\": 0.6116504854368932,\n \"acc_norm_stderr\": 0.04825729337356389\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6709401709401709,\n \"acc_stderr\": 0.030782321577688173,\n \"acc_norm\": 0.6709401709401709,\n \"acc_norm_stderr\": 0.030782321577688173\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5568326947637292,\n \"acc_stderr\": 
0.0177640850353484,\n \"acc_norm\": 0.5568326947637292,\n \"acc_norm_stderr\": 0.0177640850353484\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5317919075144508,\n \"acc_stderr\": 0.02686462436675665,\n \"acc_norm\": 0.5317919075144508,\n \"acc_norm_stderr\": 0.02686462436675665\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.311731843575419,\n \"acc_stderr\": 0.015491756531894638,\n \"acc_norm\": 0.311731843575419,\n \"acc_norm_stderr\": 0.015491756531894638\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.028580341065138296,\n \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.028580341065138296\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5016077170418006,\n \"acc_stderr\": 0.02839794490780661,\n \"acc_norm\": 0.5016077170418006,\n \"acc_norm_stderr\": 0.02839794490780661\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.02764847787741332,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.02764847787741332\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.32269503546099293,\n \"acc_stderr\": 0.027889139300534802,\n \"acc_norm\": 0.32269503546099293,\n \"acc_norm_stderr\": 0.027889139300534802\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.36114732724902215,\n \"acc_stderr\": 0.012267935477519039,\n \"acc_norm\": 0.36114732724902215,\n \"acc_norm_stderr\": 0.012267935477519039\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.029896163033125464,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.029896163033125464\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.42320261437908496,\n \"acc_stderr\": 0.01998780976948206,\n \"acc_norm\": 0.42320261437908496,\n \"acc_norm_stderr\": 0.01998780976948206\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04789131426105757,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04789131426105757\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5673469387755102,\n \"acc_stderr\": 0.031717528240626645,\n \"acc_norm\": 0.5673469387755102,\n \"acc_norm_stderr\": 0.031717528240626645\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6716417910447762,\n \"acc_stderr\": 0.03320685889744324,\n \"acc_norm\": 0.6716417910447762,\n \"acc_norm_stderr\": 0.03320685889744324\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39759036144578314,\n \"acc_stderr\": 0.038099730845402184,\n \"acc_norm\": 0.39759036144578314,\n \"acc_norm_stderr\": 0.038099730845402184\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.03834234744164993,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.03834234744164993\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2582619339045288,\n \"mc1_stderr\": 0.015321821688476199,\n \"mc2\": 0.3887901119268913,\n \"mc2_stderr\": 0.014502145592953165\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.632991318074191,\n \"acc_stderr\": 0.013546284512919645\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.332827899924185,\n \"acc_stderr\": 0.012979892496598281\n }\n}\n```", "repo_url": 
"https://huggingface.co/EleutherAI/llemma_7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|arc:challenge|25_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|arc:challenge|25_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|gsm8k|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|gsm8k|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hellaswag|10_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|hellaswag|10_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T19-07-36.777097.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T19-07-36.777097.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T19-07-36.777097.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T19-07-36.777097.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T19-07-36.777097.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T19-07-36.777097.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T19-07-36.777097.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T19-07-36.777097.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T19-07-36.777097.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T19-07-36.777097.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T19-07-36.777097.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T19-07-36.777097.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T19-07-36.777097.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T19-07-36.777097.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T19-07-36.777097.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T19-07-36.777097.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T19-07-36.777097.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T19-07-36.777097.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T19-07-36.777097.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T19-07-36.777097.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T19-07-36.777097.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T19-07-36.777097.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T19-07-36.777097.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T19-07-36.777097.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T19-07-36.777097.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T19-07-36.777097.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T19-07-36.777097.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T19-07-36.777097.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T19-07-36.777097.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T19-07-36.777097.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T19-07-36.777097.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T19-07-36.777097.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T19-07-36.777097.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T19-07-36.777097.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T19-07-36.777097.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T19-07-36.777097.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T19-07-36.777097.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T19-07-36.777097.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T19-07-36.777097.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T19-07-36.777097.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T19-07-36.777097.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T19-07-36.777097.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T19-07-36.777097.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T19-07-36.777097.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T19-07-36.777097.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T19-07-36.777097.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T19-07-36.777097.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T19-07-36.777097.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T19-07-36.777097.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T19-07-36.777097.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T19-07-36.777097.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T19-07-36.777097.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T19-07-36.777097.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T19-07-36.777097.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T19-07-36.777097.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T19-07-36.777097.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T21-06-01.286568.parquet", 
"**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T21-06-01.286568.parquet", 
"**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T21-06-01.286568.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T21-06-01.286568.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T19-07-36.777097.parquet"]}, 
{"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["**/details_harness|winogrande|5_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": ["**/details_harness|winogrande|5_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-04T21-06-01.286568.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_04T19_07_36.777097", "path": ["results_2024-02-04T19-07-36.777097.parquet"]}, {"split": "2024_02_04T21_06_01.286568", "path": 
["results_2024-02-04T21-06-01.286568.parquet"]}, {"split": "latest", "path": ["results_2024-02-04T21-06-01.286568.parquet"]}]}]} | 2024-02-04T21:08:27+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of EleutherAI/llemma_7b
Dataset automatically created during the evaluation run of model EleutherAI/llemma_7b on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
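A minimal sketch (the repository name below is assumed from the leaderboard's usual `details_<org>__<model>` naming pattern; `harness_winogrande_5` is one of the configurations listed in this card's files):
```python
from datasets import load_dataset

# Repository name assumed from the leaderboard's "details_<org>__<model>" convention.
data = load_dataset("open-llm-leaderboard/details_EleutherAI__llemma_7b",
	"harness_winogrande_5",
	split="train")
```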
## Latest results
These are the latest results from run 2024-02-04T21:06:01.286568 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of EleutherAI/llemma_7b\n\n\n\nDataset automatically created during the evaluation run of model EleutherAI/llemma_7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T21:06:01.286568(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of EleutherAI/llemma_7b\n\n\n\nDataset automatically created during the evaluation run of model EleutherAI/llemma_7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T21:06:01.286568(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
88ec74c3c0b2f67342c3f3b28f10b6b5d6a2caad | # Dataset Card for "ai4bharat-hi-subset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | zicsx/ai4bharat-hi-subset | [
"task_categories:text-generation",
"size_categories:100M<n<1B",
"language:hi",
"license:apache-2.0",
"region:us"
] | 2024-02-04T19:16:02+00:00 | {"language": ["hi"], "license": "apache-2.0", "size_categories": ["100M<n<1B"], "task_categories": ["text-generation"], "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 80196074466, "num_examples": 106391910}], "download_size": 6800633717, "dataset_size": 80196074466}} | 2024-02-04T21:01:37+00:00 | [] | [
"hi"
] | TAGS
#task_categories-text-generation #size_categories-100M<n<1B #language-Hindi #license-apache-2.0 #region-us
| # Dataset Card for "ai4bharat-hi-subset"
More Information needed | [
"# Dataset Card for \"ai4bharat-hi-subset\"\n\nMore Information needed"
] | [
"TAGS\n#task_categories-text-generation #size_categories-100M<n<1B #language-Hindi #license-apache-2.0 #region-us \n",
"# Dataset Card for \"ai4bharat-hi-subset\"\n\nMore Information needed"
] |
27ce2f607473bdae0a47e22cd21d170113915587 |
# Dataset Card for Evaluation run of nlpguy/Westgate
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [nlpguy/Westgate](https://huggingface.co/nlpguy/Westgate) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nlpguy__Westgate",
"harness_winogrande_5",
split="train")
```
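The aggregated metrics described above can be pulled the same way through the "results" configuration (a sketch; the "latest" split name is assumed from the per-run file listings used by these evaluation datasets, which expose each timestamped run plus a "latest" alias):
```python
from datasets import load_dataset

# "results" holds the aggregated metrics; "latest" is assumed to alias the most recent run.
results = load_dataset("open-llm-leaderboard/details_nlpguy__Westgate",
	"results",
	split="latest")
print(results[0])
```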
## Latest results
These are the [latest results from run 2024-02-04T19:13:46.734414](https://huggingface.co/datasets/open-llm-leaderboard/details_nlpguy__Westgate/blob/main/results_2024-02-04T19-13-46.734414.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.656811843796802,
"acc_stderr": 0.031938703448252795,
"acc_norm": 0.6560424316624701,
"acc_norm_stderr": 0.032612557879319985,
"mc1": 0.4785801713586291,
"mc1_stderr": 0.017487432144711806,
"mc2": 0.6258706346942783,
"mc2_stderr": 0.015516885259749542
},
"harness|arc:challenge|25": {
"acc": 0.6843003412969283,
"acc_stderr": 0.013582571095815291,
"acc_norm": 0.7141638225255973,
"acc_norm_stderr": 0.013203196088537372
},
"harness|hellaswag|10": {
"acc": 0.7126070503883688,
"acc_stderr": 0.004516215206715354,
"acc_norm": 0.8813981278629756,
"acc_norm_stderr": 0.0032265867834212897
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.02783491252754406,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.02783491252754406
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266344,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266344
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6068965517241379,
"acc_stderr": 0.0407032901370707,
"acc_norm": 0.6068965517241379,
"acc_norm_stderr": 0.0407032901370707
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.02540255550326091,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.02540255550326091
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356853,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586808,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586808
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.020986854593289733,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.020986854593289733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402538,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402538
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251972,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251972
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8550458715596331,
"acc_stderr": 0.015094215699700476,
"acc_norm": 0.8550458715596331,
"acc_norm_stderr": 0.015094215699700476
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.025524722324553353,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.025524722324553353
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752599,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752599
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8288633461047255,
"acc_stderr": 0.0134682016140663,
"acc_norm": 0.8288633461047255,
"acc_norm_stderr": 0.0134682016140663
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069367,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069367
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43687150837988825,
"acc_stderr": 0.01658868086453063,
"acc_norm": 0.43687150837988825,
"acc_norm_stderr": 0.01658868086453063
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188933,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188933
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7623456790123457,
"acc_stderr": 0.02368359183700856,
"acc_norm": 0.7623456790123457,
"acc_norm_stderr": 0.02368359183700856
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4771838331160365,
"acc_stderr": 0.0127569333828237,
"acc_norm": 0.4771838331160365,
"acc_norm_stderr": 0.0127569333828237
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.028332959514031208,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.028332959514031208
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.684640522875817,
"acc_stderr": 0.01879808628488689,
"acc_norm": 0.684640522875817,
"acc_norm_stderr": 0.01879808628488689
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160882,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160882
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4785801713586291,
"mc1_stderr": 0.017487432144711806,
"mc2": 0.6258706346942783,
"mc2_stderr": 0.015516885259749542
},
"harness|winogrande|5": {
"acc": 0.8571428571428571,
"acc_stderr": 0.009834691297450127
},
"harness|gsm8k|5": {
"acc": 0.7005307050796058,
"acc_stderr": 0.012616300735519656
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_nlpguy__Westgate | [
"region:us"
] | 2024-02-04T19:16:08+00:00 | {"pretty_name": "Evaluation run of nlpguy/Westgate", "dataset_summary": "Dataset automatically created during the evaluation run of model [nlpguy/Westgate](https://huggingface.co/nlpguy/Westgate) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nlpguy__Westgate\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-04T19:13:46.734414](https://huggingface.co/datasets/open-llm-leaderboard/details_nlpguy__Westgate/blob/main/results_2024-02-04T19-13-46.734414.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.656811843796802,\n \"acc_stderr\": 0.031938703448252795,\n \"acc_norm\": 0.6560424316624701,\n \"acc_norm_stderr\": 0.032612557879319985,\n \"mc1\": 0.4785801713586291,\n \"mc1_stderr\": 0.017487432144711806,\n \"mc2\": 0.6258706346942783,\n \"mc2_stderr\": 0.015516885259749542\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6843003412969283,\n \"acc_stderr\": 0.013582571095815291,\n \"acc_norm\": 0.7141638225255973,\n \"acc_norm_stderr\": 0.013203196088537372\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7126070503883688,\n \"acc_stderr\": 0.004516215206715354,\n \"acc_norm\": 0.8813981278629756,\n \"acc_norm_stderr\": 0.0032265867834212897\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.02783491252754406,\n \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.02783491252754406\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n 
\"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266344,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266344\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6068965517241379,\n \"acc_stderr\": 0.0407032901370707,\n \"acc_norm\": 0.6068965517241379,\n \"acc_norm_stderr\": 0.0407032901370707\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41798941798941797,\n \"acc_stderr\": 0.02540255550326091,\n \"acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.02540255550326091\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n \"acc_stderr\": 0.02341529343356853,\n \"acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.02341529343356853\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586808,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586808\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.020986854593289733,\n \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.020986854593289733\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6666666666666666,\n \"acc_stderr\": 0.023901157979402538,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402538\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251972,\n \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251972\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8550458715596331,\n \"acc_stderr\": 0.015094215699700476,\n \"acc_norm\": 0.8550458715596331,\n \"acc_norm_stderr\": 0.015094215699700476\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.025524722324553353,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.025524722324553353\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752599,\n \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752599\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8288633461047255,\n \"acc_stderr\": 0.0134682016140663,\n 
\"acc_norm\": 0.8288633461047255,\n \"acc_norm_stderr\": 0.0134682016140663\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069367,\n \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069367\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43687150837988825,\n \"acc_stderr\": 0.01658868086453063,\n \"acc_norm\": 0.43687150837988825,\n \"acc_norm_stderr\": 0.01658868086453063\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.025670259242188933,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.025670259242188933\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7623456790123457,\n \"acc_stderr\": 0.02368359183700856,\n \"acc_norm\": 0.7623456790123457,\n \"acc_norm_stderr\": 0.02368359183700856\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4771838331160365,\n \"acc_stderr\": 0.0127569333828237,\n \"acc_norm\": 0.4771838331160365,\n \"acc_norm_stderr\": 0.0127569333828237\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.028332959514031208,\n \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.028332959514031208\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.684640522875817,\n \"acc_stderr\": 0.01879808628488689,\n \"acc_norm\": 0.684640522875817,\n \"acc_norm_stderr\": 0.01879808628488689\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160882,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160882\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4785801713586291,\n \"mc1_stderr\": 0.017487432144711806,\n \"mc2\": 0.6258706346942783,\n \"mc2_stderr\": 0.015516885259749542\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8571428571428571,\n \"acc_stderr\": 0.009834691297450127\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7005307050796058,\n \"acc_stderr\": 0.012616300735519656\n }\n}\n```", "repo_url": 
"https://huggingface.co/nlpguy/Westgate", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|arc:challenge|25_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|gsm8k|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hellaswag|10_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T19-13-46.734414.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T19-13-46.734414.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T19-13-46.734414.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T19-13-46.734414.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T19-13-46.734414.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T19-13-46.734414.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["**/details_harness|winogrande|5_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-04T19-13-46.734414.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_04T19_13_46.734414", "path": ["results_2024-02-04T19-13-46.734414.parquet"]}, {"split": "latest", "path": 
["results_2024-02-04T19-13-46.734414.parquet"]}]}]} | 2024-02-04T19:16:29+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of nlpguy/Westgate
Dataset automatically created during the evaluation run of model nlpguy/Westgate on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
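A minimal sketch of that call is shown below. It assumes the details repository follows the usual open-llm-leaderboard/details_<org>__<model> naming for the evaluated model (nlpguy/Westgate) and uses one of the task configurations listed in this card's metadata:

```python
from datasets import load_dataset

# Hypothetical repository id inferred from the evaluated model; any of the
# 63 task configurations (e.g. "harness_winogrande_5") can be requested.
data = load_dataset(
    "open-llm-leaderboard/details_nlpguy__Westgate",
    "harness_winogrande_5",
    split="train",  # "train" tracks the latest run; timestamped splits hold individual runs
)
```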
## Latest results
These are the latest results from run 2024-02-04T19:13:46.734414 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of nlpguy/Westgate\n\n\n\nDataset automatically created during the evaluation run of model nlpguy/Westgate on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T19:13:46.734414(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of nlpguy/Westgate\n\n\n\nDataset automatically created during the evaluation run of model nlpguy/Westgate on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T19:13:46.734414(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
99e9b6f1a47866379121312d88dca3828beccc9b | # Dataset Card for "hanna-pairwise"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | llm-aes/hanna-pairwise | [
"region:us"
] | 2024-02-04T20:41:10+00:00 | {"dataset_info": {"features": [{"name": "output_1", "dtype": "string"}, {"name": "output_2", "dtype": "string"}, {"name": "human_label", "dtype": "int64"}, {"name": "instruction", "dtype": "string"}, {"name": "input", "dtype": "string"}, {"name": "generator_1", "dtype": "string"}, {"name": "generator_2", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 15093325, "num_examples": 5280}], "download_size": 1595883, "dataset_size": 15093325}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-11T20:39:57+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "hanna-pairwise"
More Information needed | [
"# Dataset Card for \"hanna-pairwise\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"hanna-pairwise\"\n\nMore Information needed"
] |
21a56e17a10f8bf1b361711d4374358089d02c63 |
# Dataset Card for Evaluation run of gmonsoon/OpenMia-Indo-Engineering
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [gmonsoon/OpenMia-Indo-Engineering](https://huggingface.co/gmonsoon/OpenMia-Indo-Engineering) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
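# "harness_winogrande_5" is one of the 63 per-task configurations; the "train"
# split always points to the latest run (each run also has a timestamped split).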
data = load_dataset("open-llm-leaderboard/details_gmonsoon__OpenMia-Indo-Engineering",
"harness_winogrande_5",
split="train")
```
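
The aggregated metrics referenced above are stored in the separate "results" configuration. A minimal sketch, assuming the "results" configuration and "latest" split naming used by these evaluation datasets:

```python
from datasets import load_dataset

# Run-level aggregated metrics (the same numbers shown under "Latest results" below)
results = load_dataset(
    "open-llm-leaderboard/details_gmonsoon__OpenMia-Indo-Engineering",
    "results",
    split="latest",
)
```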
## Latest results
These are the [latest results from run 2024-02-04T20:44:41.527501](https://huggingface.co/datasets/open-llm-leaderboard/details_gmonsoon__OpenMia-Indo-Engineering/blob/main/results_2024-02-04T20-44-41.527501.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6324060235996815,
"acc_stderr": 0.03231686157638383,
"acc_norm": 0.6331079164519123,
"acc_norm_stderr": 0.032980184119772604,
"mc1": 0.4186046511627907,
"mc1_stderr": 0.01727001528447686,
"mc2": 0.5793947082847677,
"mc2_stderr": 0.01530573457723597
},
"harness|arc:challenge|25": {
"acc": 0.6245733788395904,
"acc_stderr": 0.014150631435111728,
"acc_norm": 0.6715017064846417,
"acc_norm_stderr": 0.0137249784655373
},
"harness|hellaswag|10": {
"acc": 0.6482772356104362,
"acc_stderr": 0.004765320784902126,
"acc_norm": 0.8501294562836088,
"acc_norm_stderr": 0.0035621498909627174
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5234042553191489,
"acc_stderr": 0.032650194750335815,
"acc_norm": 0.5234042553191489,
"acc_norm_stderr": 0.032650194750335815
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.02519710107424649,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.02519710107424649
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.02805779167298902,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.02805779167298902
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.02463978909770944,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.02463978909770944
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6358974358974359,
"acc_stderr": 0.02439667298509476,
"acc_norm": 0.6358974358974359,
"acc_norm_stderr": 0.02439667298509476
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.028406533090608456,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.028406533090608456
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6218487394957983,
"acc_stderr": 0.03149930577784906,
"acc_norm": 0.6218487394957983,
"acc_norm_stderr": 0.03149930577784906
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8256880733944955,
"acc_stderr": 0.016265675632010347,
"acc_norm": 0.8256880733944955,
"acc_norm_stderr": 0.016265675632010347
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.02732547096671632,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.02732547096671632
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601436,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601436
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7130044843049327,
"acc_stderr": 0.03036037971029195,
"acc_norm": 0.7130044843049327,
"acc_norm_stderr": 0.03036037971029195
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516303,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516303
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.039578354719809805,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.039578354719809805
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5357142857142857,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.5357142857142857,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8186462324393359,
"acc_stderr": 0.01377869377846408,
"acc_norm": 0.8186462324393359,
"acc_norm_stderr": 0.01377869377846408
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.024257901705323378,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.024257901705323378
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3027932960893855,
"acc_stderr": 0.015366860386397112,
"acc_norm": 0.3027932960893855,
"acc_norm_stderr": 0.015366860386397112
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.026090162504279056,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.026090162504279056
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818763,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818763
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7006172839506173,
"acc_stderr": 0.02548311560119545,
"acc_norm": 0.7006172839506173,
"acc_norm_stderr": 0.02548311560119545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4574468085106383,
"acc_stderr": 0.029719281272236848,
"acc_norm": 0.4574468085106383,
"acc_norm_stderr": 0.029719281272236848
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4726205997392438,
"acc_stderr": 0.012751075788015062,
"acc_norm": 0.4726205997392438,
"acc_norm_stderr": 0.012751075788015062
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6360294117647058,
"acc_stderr": 0.029227192460032025,
"acc_norm": 0.6360294117647058,
"acc_norm_stderr": 0.029227192460032025
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6290849673202614,
"acc_stderr": 0.019542101564854125,
"acc_norm": 0.6290849673202614,
"acc_norm_stderr": 0.019542101564854125
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4186046511627907,
"mc1_stderr": 0.01727001528447686,
"mc2": 0.5793947082847677,
"mc2_stderr": 0.01530573457723597
},
"harness|winogrande|5": {
"acc": 0.8232044198895028,
"acc_stderr": 0.010721923287918747
},
"harness|gsm8k|5": {
"acc": 0.6489764973464746,
"acc_stderr": 0.013146945941397222
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_gmonsoon__OpenMia-Indo-Engineering | [
"region:us"
] | 2024-02-04T20:47:02+00:00 | {"pretty_name": "Evaluation run of gmonsoon/OpenMia-Indo-Engineering", "dataset_summary": "Dataset automatically created during the evaluation run of model [gmonsoon/OpenMia-Indo-Engineering](https://huggingface.co/gmonsoon/OpenMia-Indo-Engineering) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_gmonsoon__OpenMia-Indo-Engineering\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-04T20:44:41.527501](https://huggingface.co/datasets/open-llm-leaderboard/details_gmonsoon__OpenMia-Indo-Engineering/blob/main/results_2024-02-04T20-44-41.527501.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6324060235996815,\n \"acc_stderr\": 0.03231686157638383,\n \"acc_norm\": 0.6331079164519123,\n \"acc_norm_stderr\": 0.032980184119772604,\n \"mc1\": 0.4186046511627907,\n \"mc1_stderr\": 0.01727001528447686,\n \"mc2\": 0.5793947082847677,\n \"mc2_stderr\": 0.01530573457723597\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6245733788395904,\n \"acc_stderr\": 0.014150631435111728,\n \"acc_norm\": 0.6715017064846417,\n \"acc_norm_stderr\": 0.0137249784655373\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6482772356104362,\n \"acc_stderr\": 0.004765320784902126,\n \"acc_norm\": 0.8501294562836088,\n \"acc_norm_stderr\": 0.0035621498909627174\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816508,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816508\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5234042553191489,\n \"acc_stderr\": 0.032650194750335815,\n \"acc_norm\": 0.5234042553191489,\n \"acc_norm_stderr\": 0.032650194750335815\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.02519710107424649,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.02519710107424649\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7677419354838709,\n \"acc_stderr\": 0.024022256130308235,\n \"acc_norm\": 0.7677419354838709,\n \"acc_norm_stderr\": 0.024022256130308235\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8080808080808081,\n \"acc_stderr\": 0.02805779167298902,\n \"acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.02805779167298902\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.02463978909770944\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6358974358974359,\n \"acc_stderr\": 0.02439667298509476,\n \"acc_norm\": 0.6358974358974359,\n \"acc_norm_stderr\": 0.02439667298509476\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.31851851851851853,\n \"acc_stderr\": 0.028406533090608456,\n \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.028406533090608456\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6218487394957983,\n \"acc_stderr\": 0.03149930577784906,\n \"acc_norm\": 0.6218487394957983,\n \"acc_norm_stderr\": 0.03149930577784906\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8256880733944955,\n \"acc_stderr\": 0.016265675632010347,\n \"acc_norm\": 0.8256880733944955,\n \"acc_norm_stderr\": 0.016265675632010347\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49074074074074076,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8137254901960784,\n \"acc_stderr\": 0.02732547096671632,\n \"acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.02732547096671632\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601436,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601436\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7130044843049327,\n \"acc_stderr\": 0.03036037971029195,\n \"acc_norm\": 0.7130044843049327,\n \"acc_norm_stderr\": 0.03036037971029195\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516303,\n \"acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516303\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.039578354719809805,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.039578354719809805\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5357142857142857,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.5357142857142857,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8186462324393359,\n \"acc_stderr\": 0.01377869377846408,\n \"acc_norm\": 0.8186462324393359,\n \"acc_norm_stderr\": 0.01377869377846408\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.024257901705323378,\n \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.024257901705323378\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3027932960893855,\n \"acc_stderr\": 0.015366860386397112,\n \"acc_norm\": 0.3027932960893855,\n \"acc_norm_stderr\": 0.015366860386397112\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.026090162504279056,\n \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.026090162504279056\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7006172839506173,\n \"acc_stderr\": 0.02548311560119545,\n \"acc_norm\": 0.7006172839506173,\n \"acc_norm_stderr\": 0.02548311560119545\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4574468085106383,\n \"acc_stderr\": 0.029719281272236848,\n \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.029719281272236848\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4726205997392438,\n \"acc_stderr\": 0.012751075788015062,\n \"acc_norm\": 0.4726205997392438,\n \"acc_norm_stderr\": 0.012751075788015062\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6360294117647058,\n \"acc_stderr\": 0.029227192460032025,\n \"acc_norm\": 0.6360294117647058,\n \"acc_norm_stderr\": 0.029227192460032025\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6290849673202614,\n \"acc_stderr\": 0.019542101564854125,\n \"acc_norm\": 0.6290849673202614,\n \"acc_norm_stderr\": 0.019542101564854125\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4186046511627907,\n \"mc1_stderr\": 0.01727001528447686,\n \"mc2\": 0.5793947082847677,\n \"mc2_stderr\": 0.01530573457723597\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8232044198895028,\n \"acc_stderr\": 0.010721923287918747\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6489764973464746,\n \"acc_stderr\": 0.013146945941397222\n }\n}\n```", "repo_url": 
"https://huggingface.co/gmonsoon/OpenMia-Indo-Engineering", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|arc:challenge|25_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|gsm8k|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|hellaswag|10_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T20-44-41.527501.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T20-44-41.527501.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T20-44-41.527501.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T20-44-41.527501.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T20-44-41.527501.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_04T20_44_41.527501", "path": ["**/details_harness|winogrande|5_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-04T20-44-41.527501.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_04T20_44_41.527501", "path": ["results_2024-02-04T20-44-41.527501.parquet"]}, {"split": "latest", "path": ["results_2024-02-04T20-44-41.527501.parquet"]}]}]} | 2024-02-04T20:47:29+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of gmonsoon/OpenMia-Indo-Engineering
Dataset automatically created during the evaluation run of model gmonsoon/OpenMia-Indo-Engineering on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
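Here is a minimal sketch using the `datasets` library; the repository id and the "harness_winogrande_5" configuration name are taken from this record's own configuration listing:

```python
from datasets import load_dataset

# Per-sample details for one evaluated task of this run; the other task
# configurations follow the same naming pattern.
data = load_dataset(
    "open-llm-leaderboard/details_gmonsoon__OpenMia-Indo-Engineering",
    "harness_winogrande_5",
    split="train",
)
```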
## Latest results
These are the latest results from run 2024-02-04T20:44:41.527501 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the per-task results and in the "latest" split for each eval):
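The aggregated numbers are also stored in the "results" configuration of the same repository; below is a minimal sketch of loading its "latest" split (both names come from the configuration listing in this record's metadata):

```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_gmonsoon__OpenMia-Indo-Engineering",
    "results",
    split="latest",
)
print(results[0])
```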
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of gmonsoon/OpenMia-Indo-Engineering\n\n\n\nDataset automatically created during the evaluation run of model gmonsoon/OpenMia-Indo-Engineering on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T20:44:41.527501(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of gmonsoon/OpenMia-Indo-Engineering\n\n\n\nDataset automatically created during the evaluation run of model gmonsoon/OpenMia-Indo-Engineering on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T20:44:41.527501(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
63914832ed0239ed518239ce2b1000724d1a8829 |
# hh-rlhf-12k-ja
This repository provides a human preference dataset developed by LLM-jp, a collaborative project launched in Japan.
This dataset is a Japanese translation of a subset of [hh-rlhf](https://huggingface.co/datasets/Anthropic/hh-rlhf) using DeepL.
This dataset consists of 12,000 entries randomly sampled from hh-rlhf. Specifically, it includes a random selection of 3,000 entries from the training splits of the four groups: harmless-base, helpful-base, helpful-online, and helpful-rejection-sampled. For more information on each group, please refer to the original dataset documentation.
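As a rough sketch, the sampled entries can be inspected with the `datasets` library; the repository id is the one of this card, while the split name and field access below are assumptions, since they are not spelled out here:

```python
from datasets import load_dataset

# Assumed default configuration and "train" split; adjust if the repository differs.
ds = load_dataset("llm-jp/hh-rlhf-12k-ja", split="train")
print(len(ds))  # expected to be on the order of 12,000 entries
print(ds[0])    # one translated preference example
```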
## Send Questions to
llm-jp(at)nii.ac.jp
## Model Card Authors
The names are listed in alphabetical order.
Hirokazu Kiyomaru, Hiroshi Matsuda, Jun Suzuki, Namgi Han, Saku Sugawara, Shota Sasaki, Shuhei Kurita, Taishi Nakamura, Takashi Kodama, Takumi Okamoto. | llm-jp/hh-rlhf-12k-ja | [
"size_categories:10K<n<100K",
"language:ja",
"license:mit",
"region:us"
] | 2024-02-04T21:19:53+00:00 | {"language": ["ja"], "license": "mit", "size_categories": ["10K<n<100K"]} | 2024-02-04T21:45:59+00:00 | [] | [
"ja"
] | TAGS
#size_categories-10K<n<100K #language-Japanese #license-mit #region-us
|
# hh-rlhf-12k-ja
This repository provides a human preference dataset developed by LLM-jp, a collaborative project launched in Japan.
This dataset is a Japanese translation of a subset of hh-rlhf using DeepL.
This dataset consists of 12,000 entries randomly sampled from hh-rlhf. Specifically, it includes a random selection of 3,000 entries from the training splits of the four groups: harmless-base, helpful-base, helpful-online, and helpful-rejection-sampled. For more information on each group, please refer to the original dataset documentation.
## Send Questions to
llm-jp(at)URL
## Model Card Authors
The names are listed in alphabetical order.
Hirokazu Kiyomaru, Hiroshi Matsuda, Jun Suzuki, Namgi Han, Saku Sugawara, Shota Sasaki, Shuhei Kurita, Taishi Nakamura, Takashi Kodama, Takumi Okamoto. | [
"# hh-rlhf-12k-ja\n\nThis repository provides a human preference dataset developed by LLM-jp, a collaborative project launched in Japan.\n\nThis dataset is a Japanese translation of a subset of hh-rlhf using DeepL.\n\nThis dataset consists of 12,000 entries randomly sampled from hh-rlhf. Specifically, it includes a random selection of 3,000 entries from the training splits of the four groups: harmless-base, helpful-base, helpful-online, and helpful-rejection-sampled. For more information on each group, please refer to the original dataset documentation.",
"## Send Questions to\nllm-jp(at)URL",
"## Model Card Authors\nThe names are listed in alphabetical order.\n\nHirokazu Kiyomaru, Hiroshi Matsuda, Jun Suzuki, Namgi Han, Saku Sugawara, Shota Sasaki, Shuhei Kurita, Taishi Nakamura, Takashi Kodama, Takumi Okamoto."
] | [
"TAGS\n#size_categories-10K<n<100K #language-Japanese #license-mit #region-us \n",
"# hh-rlhf-12k-ja\n\nThis repository provides a human preference dataset developed by LLM-jp, a collaborative project launched in Japan.\n\nThis dataset is a Japanese translation of a subset of hh-rlhf using DeepL.\n\nThis dataset consists of 12,000 entries randomly sampled from hh-rlhf. Specifically, it includes a random selection of 3,000 entries from the training splits of the four groups: harmless-base, helpful-base, helpful-online, and helpful-rejection-sampled. For more information on each group, please refer to the original dataset documentation.",
"## Send Questions to\nllm-jp(at)URL",
"## Model Card Authors\nThe names are listed in alphabetical order.\n\nHirokazu Kiyomaru, Hiroshi Matsuda, Jun Suzuki, Namgi Han, Saku Sugawara, Shota Sasaki, Shuhei Kurita, Taishi Nakamura, Takashi Kodama, Takumi Okamoto."
] |
36520e7565136e3e4c175807e54608d58f6a4428 |
# Dataset Card for Finnish-NLP/belebele
## Creation process
- Finnish subset loaded from facebook/belebele | Finnish-NLP/belebele-fi-filtered-sft | [
"task_categories:text-generation",
"task_categories:question-answering",
"language:fi",
"license:cc-by-sa-4.0",
"region:us"
] | 2024-02-04T21:40:28+00:00 | {"language": ["fi"], "license": "cc-by-sa-4.0", "task_categories": ["text-generation", "question-answering"], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "source", "dtype": "null"}, {"name": "text", "dtype": "null"}, {"name": "chosen", "dtype": "null"}, {"name": "rejected", "dtype": "null"}, {"name": "prommpt", "dtype": "null"}, {"name": "conversations", "list": [{"name": "from", "dtype": "string"}, {"name": "value", "dtype": "string"}]}, {"name": "category", "dtype": "string"}, {"name": "conversations_len", "dtype": "int64"}, {"name": "person_1", "dtype": "string"}, {"name": "person_2", "dtype": "string"}, {"name": "instruction", "dtype": "string"}, {"name": "response", "dtype": "string"}, {"name": "sample_words", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 414674, "num_examples": 300}], "download_size": 269823, "dataset_size": 414674}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-13T21:32:14+00:00 | [] | [
"fi"
] | TAGS
#task_categories-text-generation #task_categories-question-answering #language-Finnish #license-cc-by-sa-4.0 #region-us
|
# Dataset Card for Finnish-NLP/belebele
## Creation process
- Finnish subset loaded from facebook/belebele | [
"# Dataset Card for Finnish-NLP/benebele",
"## Creation process\n - Finnish subset loaded from facebook/belebele"
] | [
"TAGS\n#task_categories-text-generation #task_categories-question-answering #language-Finnish #license-cc-by-sa-4.0 #region-us \n",
"# Dataset Card for Finnish-NLP/benebele",
"## Creation process\n - Finnish subset loaded from facebook/belebele"
] |
befd310eddff17f0b8088fdb57cedc1811c9fc60 |
# Restful Booker Platform Data Set
Scraped code from the Restful Booker Platform code base found here:
[https://github.com/mwinteringham/restful-booker-platform](https://github.com/mwinteringham/restful-booker-platform)
| 2bittester/rbp-data-set | [
"language:eng",
"license:mit",
"Scrapped Code",
"region:us"
] | 2024-02-04T21:57:15+00:00 | {"language": ["eng"], "license": "mit", "pretty_name": "RBP Data Set", "tags": ["Scrapped Code"]} | 2024-02-07T21:15:46+00:00 | [] | [
"eng"
] | TAGS
#language-English #license-mit #Scrapped Code #region-us
|
# Restful Booker Platform Data Set
Scraped code from the Restful Booker Platform code base found here:
URL
| [
"# Restful Booker Platform Data Set\n\nScrapped code from the Restful Booker Platform code base found here:\n\nURL"
] | [
"TAGS\n#language-English #license-mit #Scrapped Code #region-us \n",
"# Restful Booker Platform Data Set\n\nScrapped code from the Restful Booker Platform code base found here:\n\nURL"
] |
5e682bcb2fab62a5c132769d2675ee46e9bac61b |
# Alpaca-kartuli-0.1
<!-- Provide a quick summary of the dataset. -->
A version of the alpaca Dataset translated into Georgian.

- **Source:** [alpaca-cleaned](https://huggingface.co/datasets/yahma/alpaca-cleaned#dataset-card-for-alpaca-cleaned)
| Temo/alpaca-kartuli-0.1 | [
"size_categories:10K<n<100K",
"language:ka",
"license:cc-by-4.0",
"region:us"
] | 2024-02-04T21:59:37+00:00 | {"language": ["ka"], "license": "cc-by-4.0", "size_categories": ["10K<n<100K"]} | 2024-02-04T22:45:59+00:00 | [] | [
"ka"
] | TAGS
#size_categories-10K<n<100K #language-Georgian #license-cc-by-4.0 #region-us
|
# Alpaca-kartuli-0.1
A version of the alpaca Dataset translated into Georgian.
- Source: alpaca-cleaned
| [
"# Alpaca-kartuli-0.1\n\n\n\nalpaca Dataset-ის ქართულად გადმოთრგმნილი ვერსია.\n\n- წყარო: alpaca-cleaned"
] | [
"TAGS\n#size_categories-10K<n<100K #language-Georgian #license-cc-by-4.0 #region-us \n",
"# Alpaca-kartuli-0.1\n\n\n\nalpaca Dataset-ის ქართულად გადმოთრგმნილი ვერსია.\n\n- წყარო: alpaca-cleaned"
] |
fa3732bc0879e011048168e50b79d62721dca58e |
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:** https://gitlab.inria.fr/semagramme-public-projects/resources/french-fracas
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This repository contains the French version of the FraCaS Test Suite introduced in [this paper](https://aclanthology.org/2020.lrec-1.721.pdf), as well as the original English one, in a TSV format (as opposed to the XML format provided with the original paper).
FraCaS stands for "Framework for Computational Semantics".
### Supported Tasks and Leaderboards
This dataset can be used for the task of Natural Language Inference (NLI), also known as Recognizing Textual Entailment (RTE), which is a sentence-pair classification task.
It can also be used for the task of Question Answering (QA) (when using the columns `question` and `answer` instead of `hypothesis` and `label`).
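As a minimal illustration of the two framings (not an official loading script), the sketch below reads the TSV with pandas and prints each problem both as an NLI pair and as a QA pair; the file name is a placeholder, and the column names are assumed to match the Data Fields section below.

```python
import pandas as pd

# Placeholder file name; columns are assumed to match the Data Fields section.
df = pd.read_csv("fracas_fr.tsv", sep="\t")

# Map numeric NLI labels to names; 'undef' values pass through unchanged.
label_names = {"0": "entailment", "1": "neutral", "2": "contradiction"}

for _, row in df.head(3).iterrows():
    label = label_names.get(str(row["label"]), str(row["label"]))
    # NLI framing: premises + hypothesis -> label
    print(row["premises"], "=>", row["hypothesis"], f"[{label}]")
    # QA framing: premises + question -> answer
    print(row["premises"], "/", row["question"], "->", row["answer"])
```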
## Dataset Structure
### Data Fields
- `id`: Index number.
- `premises`: All premises provided for this particular example, concatenated, in French.
- `hypothesis`: The translated hypothesis in the target language (French).
- `label`: The classification label, with possible values 0 (`entailment`), 1 (`neutral`), 2 (`contradiction`), or undef (for undefined).
- `question`: The hypothesis in the form of question, in French.
- `answer`: The answer to the question, with possible values `Yes` (0), `Don't know` / `Unknown` (1), `No` (2), `undef`, or a longer phrase containing qualifications or elaborations such as `Yes, on one reading`.
- `premises_original`: All premises provided for this particular example, concatenated, in their language of origin (English).
- `premise1`: The first premise in French.
- `premise1_original`: The first premise in English.
- `premise2`: When available, the second premise in French.
- `premise2_original`: When available, the second premise in English.
- `premise3`: When available, the third premise in French.
- `premise3_original`: When available, the third premise in English.
- `premise4`: When available, the fourth premise in French.
- `premise4_original`: When available, the fourth premise in English.
- `premise5`: When available, the fifth premise in French.
- `premise5_original`: When available, the fifth premise in English.
- `hypothesis_original`: The hypothesis in English.
- `question_original`: The hypothesis in the form of question, in English.
- `note`: Text from the source document intended to explain or justify the answer, or notes added to a number of problems in order to explain issues which arose during translation.
- `topic`: Problem set / topic.
### Data Splits
The premise counts are distributed as follows:
| # premises |# problems|% problems|
|---------:|------:|------------:|
| 1 | 192 | 55.5 % |
| 2 | 122 | 35.3 % |
| 3 | 29 | 8.4 % |
| 4 | 2 | 0.6 % |
| 5 | 1 | 0.3 % |
The answer distribution is roughly as follows:
| # problems |Percentage|Answer|
|---------:|------:|------------:|
| 180 | 52% | Yes |
| 94 | 27% | Don't know |
| 31 | 9% | No |
| 41 | 12% | [other / complex] |
Here's the breakdown by topic:
| sec | topic| start| count|%|single-premise|
|-------------|--:|--:|--:|--:|--:|
| 1 |Quantifiers|1|80|23 %|50|
| 2 |Plurals|81|33|10 %|24|
| 3 |Anaphora|114|28| 8 %|6|
| 4 |Ellipsis|142|55|16 %|25|
| 5 |Adjectives|197|23|7 %|15|
| 6 |Comparatives|220|31|9 %|16|
| 7 |Temporal|251|75|22 %|39|
| 8 |Verbs|326|8|2 %|8|
| 9 |Attitudes|334|13|4 %|9|
## Additional Information
### Citation Information
**BibTeX:**
````BibTeX
@inproceedings{amblard-etal-2020-french,
title = "A {F}rench Version of the {F}ra{C}a{S} Test Suite",
author = "Amblard, Maxime and
Beysson, Cl{\'e}ment and
de Groote, Philippe and
Guillaume, Bruno and
Pogodalla, Sylvain",
editor = "Calzolari, Nicoletta and
B{\'e}chet, Fr{\'e}d{\'e}ric and
Blache, Philippe and
Choukri, Khalid and
Cieri, Christopher and
Declerck, Thierry and
Goggi, Sara and
Isahara, Hitoshi and
Maegaard, Bente and
Mariani, Joseph and
Mazo, H{\'e}l{\`e}ne and
Moreno, Asuncion and
Odijk, Jan and
Piperidis, Stelios",
booktitle = "Proceedings of the Twelfth Language Resources and Evaluation Conference",
month = may,
year = "2020",
address = "Marseille, France",
publisher = "European Language Resources Association",
url = "https://aclanthology.org/2020.lrec-1.721",
pages = "5887--5895",
abstract = "This paper presents a French version of the FraCaS test suite. This test suite, originally written in English, contains problems illustrating semantic inference in natural language. We describe linguistic choices we had to make when translating the FraCaS test suite in French, and discuss some of the issues that were raised by the translation. We also report an experiment we ran in order to test both the translation and the logical semantics underlying the problems of the test suite. This provides a way of checking formal semanticists{'} hypotheses against actual semantic capacity of speakers (in the present case, French speakers), and allow us to compare the results we obtained with the ones of similar experiments that have been conducted for other languages.",
language = "English",
ISBN = "979-10-95546-34-4",
}
````
**ACL:**
Maxime Amblard, Clément Beysson, Philippe de Groote, Bruno Guillaume, and Sylvain Pogodalla. 2020. [A French Version of the FraCaS Test Suite](https://aclanthology.org/2020.lrec-1.721/). In *Proceedings of the Twelfth Language Resources and Evaluation Conference*, pages 5887–5895, Marseille, France. European Language Resources Association. | maximoss/fracas | [
"task_categories:text-classification",
"task_categories:question-answering",
"task_ids:natural-language-inference",
"task_ids:multi-input-text-classification",
"size_categories:n<1K",
"language:fr",
"language:en",
"license:cc-by-nc-sa-4.0",
"region:us"
] | 2024-02-04T22:16:45+00:00 | {"language": ["fr", "en"], "license": "cc-by-nc-sa-4.0", "size_categories": ["n<1K"], "task_categories": ["text-classification", "question-answering"], "task_ids": ["natural-language-inference", "multi-input-text-classification"]} | 2024-02-05T15:49:23+00:00 | [] | [
"fr",
"en"
] | TAGS
#task_categories-text-classification #task_categories-question-answering #task_ids-natural-language-inference #task_ids-multi-input-text-classification #size_categories-n<1K #language-French #language-English #license-cc-by-nc-sa-4.0 #region-us
| Dataset Card for Dataset Name
=============================
Dataset Description
-------------------
* Homepage:
* Repository: URL
* Paper:
* Leaderboard:
* Point of Contact:
### Dataset Summary
This repository contains the French version of the FraCaS Test Suite introduced in this paper, as well as the original English one, in a TSV format (as opposed to the XML format provided with the original paper).
FraCaS stands for "Framework for Computational Semantics".
### Supported Tasks and Leaderboards
This dataset can be used for the task of Natural Language Inference (NLI), also known as Recognizing Textual Entailment (RTE), which is a sentence-pair classification task.
It can also be used for the task of Question Answering (QA) (when using the columns 'question' and 'answer' instead of 'hypothesis' and 'label').
Dataset Structure
-----------------
### Data Fields
* 'id': Index number.
* 'premises': All premises provided for this particular example, concatenated, in French.
* 'hypothesis': The translated hypothesis in the target language (French).
* 'label': The classification label, with possible values 0 ('entailment'), 1 ('neutral'), 2 ('contradiction'), or undef (for undefined).
* 'question': The hypothesis in the form of question, in French.
* 'answer': The answer to the question, with possible values 'Yes' (0), 'Don't know' / 'Unknown' (1), 'No' (2), 'undef', or a longer phrase containing qualifications or elaborations such as 'Yes, on one reading'.
* 'premises\_original': All premises provided for this particular example, concatenated, in their language of origin (English).
* 'premise1': The first premise in French.
* 'premise1\_original': The first premise in English.
* 'premise2': When available, the second premise in French.
* 'premise2\_original': When available, the second premise in English.
* 'premise3': When available, the third premise in French.
* 'premise3\_original': When available, the third premise in English.
* 'premise4': When available, the fourth premise in French.
* 'premise4\_original': When available, the fourth premise in English.
* 'premise5': When available, the fifth premise in French.
* 'premise5\_original': When available, the fifth premise in English.
* 'hypothesis\_original': The hypothesis in English.
* 'question\_original': The hypothesis in the form of question, in English.
* 'note': Text from the source document intended to explain or justify the answer, or notes added to a number of problems in order to explain issues which arose during translation.
* 'topic': Problem set / topic.
### Data Splits
The premise counts are distributed as follows:
The answer distribution is roughly as follows:
Here's the breakdown by topic:
Additional Information
----------------------
BibTeX:
'
ACL:
Maxime Amblard, Clément Beysson, Philippe de Groote, Bruno Guillaume, and Sylvain Pogodalla. 2020. A French Version of the FraCaS Test Suite. In *Proceedings of the Twelfth Language Resources and Evaluation Conference*, pages 5887–5895, Marseille, France. European Language Resources Association.
| [
"### Dataset Summary\n\n\nThis repository contains the French version of the FraCaS Test Suite introduced in this paper, as well as the original English one, in a TSV format (as opposed to the XML format provided with the original paper).\n\n\nFraCaS stands for \"Framework for Computational Semantics\".",
"### Supported Tasks and Leaderboards\n\n\nThis dataset can be used for the task of Natural Language Inference (NLI), also known as Recognizing Textual Entailment (RTE), which is a sentence-pair classification task.\n\n\nIt can also be used for the task of Question Answering (QA) (when using the columns 'question' and 'answer' instead of 'hypothesis' and 'label').\n\n\nDataset Structure\n-----------------",
"### Data Fields\n\n\n* 'id': Index number.\n* 'premises': All premises provided for this particular example, concatenated, in French.\n* 'hypothesis': The translated hypothesis in the target language (French).\n* 'label': The classification label, with possible values 0 ('entailment'), 1 ('neutral'), 2 ('contradiction'), or undef (for undefined).\n* 'question': The hypothesis in the form of question, in French.\n* 'answer': The answer to the question, with possible values 'Yes' (0), 'Don't know' / 'Unknown' (1), 'No' (2), 'undef', or a longer phrase containing qualifications or elaborations such as 'Yes, on one reading'.\n* 'premises\\_original': All premises provided for this particular example, concatenated, in their language of origin (English).\n* 'premise1': The first premise in English.\n* 'premise1\\_original': The first premise in English.\n* 'premise2': When available, the second premise in French.\n* 'premise2\\_original': When available, the second premise in English.\n* 'premise3': When available, the third premise in French.\n* 'premise3\\_original': When available, the third premise in English.\n* 'premise4': When available, the fourth premise in French.\n* 'premise4\\_original': When available, the fourth premise in English.\n* 'premise5': When available, the fifth premise in French.\n* 'premise5\\_original': When available, the fifth premise in English.\n* 'hypothesis\\_original': The hypothesis in English.\n* 'question\\_original': The hypothesis in the form of question, in English.\n* 'note': Text from the source document intended to explain or justify the answer, or notes added to a number of problems in order to explain issues which arose during translation.\n* 'topic': Problem set / topic.",
"### Data Splits\n\n\nThe premise counts are distributed as follows:\n\n\n\nThe answer distribution is roughly as follows:\n\n\n\nHere's the breakdown by topic:\n\n\n\nAdditional Information\n----------------------\n\n\nBibTeX:\n\n\n'\n\n\nACL:\n\n\nMaxime Amblard, Clément Beysson, Philippe de Groote, Bruno Guillaume, and Sylvain Pogodalla. 2020. A French Version of the FraCaS Test Suite. In *Proceedings of the Twelfth Language Resources and Evaluation Conference*, pages 5887–5895, Marseille, France. European Language Resources Association."
] | [
"TAGS\n#task_categories-text-classification #task_categories-question-answering #task_ids-natural-language-inference #task_ids-multi-input-text-classification #size_categories-n<1K #language-French #language-English #license-cc-by-nc-sa-4.0 #region-us \n",
"### Dataset Summary\n\n\nThis repository contains the French version of the FraCaS Test Suite introduced in this paper, as well as the original English one, in a TSV format (as opposed to the XML format provided with the original paper).\n\n\nFraCaS stands for \"Framework for Computational Semantics\".",
"### Supported Tasks and Leaderboards\n\n\nThis dataset can be used for the task of Natural Language Inference (NLI), also known as Recognizing Textual Entailment (RTE), which is a sentence-pair classification task.\n\n\nIt can also be used for the task of Question Answering (QA) (when using the columns 'question' and 'answer' instead of 'hypothesis' and 'label').\n\n\nDataset Structure\n-----------------",
"### Data Fields\n\n\n* 'id': Index number.\n* 'premises': All premises provided for this particular example, concatenated, in French.\n* 'hypothesis': The translated hypothesis in the target language (French).\n* 'label': The classification label, with possible values 0 ('entailment'), 1 ('neutral'), 2 ('contradiction'), or undef (for undefined).\n* 'question': The hypothesis in the form of question, in French.\n* 'answer': The answer to the question, with possible values 'Yes' (0), 'Don't know' / 'Unknown' (1), 'No' (2), 'undef', or a longer phrase containing qualifications or elaborations such as 'Yes, on one reading'.\n* 'premises\\_original': All premises provided for this particular example, concatenated, in their language of origin (English).\n* 'premise1': The first premise in English.\n* 'premise1\\_original': The first premise in English.\n* 'premise2': When available, the second premise in French.\n* 'premise2\\_original': When available, the second premise in English.\n* 'premise3': When available, the third premise in French.\n* 'premise3\\_original': When available, the third premise in English.\n* 'premise4': When available, the fourth premise in French.\n* 'premise4\\_original': When available, the fourth premise in English.\n* 'premise5': When available, the fifth premise in French.\n* 'premise5\\_original': When available, the fifth premise in English.\n* 'hypothesis\\_original': The hypothesis in English.\n* 'question\\_original': The hypothesis in the form of question, in English.\n* 'note': Text from the source document intended to explain or justify the answer, or notes added to a number of problems in order to explain issues which arose during translation.\n* 'topic': Problem set / topic.",
"### Data Splits\n\n\nThe premise counts are distributed as follows:\n\n\n\nThe answer distribution is roughly as follows:\n\n\n\nHere's the breakdown by topic:\n\n\n\nAdditional Information\n----------------------\n\n\nBibTeX:\n\n\n'\n\n\nACL:\n\n\nMaxime Amblard, Clément Beysson, Philippe de Groote, Bruno Guillaume, and Sylvain Pogodalla. 2020. A French Version of the FraCaS Test Suite. In *Proceedings of the Twelfth Language Resources and Evaluation Conference*, pages 5887–5895, Marseille, France. European Language Resources Association."
] |
912a85007cece1ccd20b7b6224e973ada57c2bd4 |
# Dataset Card for Evaluation run of AA051615/A0204
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AA051615/A0204](https://huggingface.co/AA051615/A0204) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AA051615__A0204",
"harness_winogrande_5",
split="train")
```
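Since each evaluated task is exposed as its own configuration, it can also be convenient to list the available configurations first; the following sketch uses the standard `datasets` helper for that and then loads one task, always taking the latest split:

```python
from datasets import get_dataset_config_names, load_dataset

# List the per-task configurations (plus the aggregated "results" one).
configs = get_dataset_config_names("open-llm-leaderboard/details_AA051615__A0204")
print(len(configs), configs[:5])

# Load the details for one task, e.g. GSM8K, always pointing at the latest results.
gsm8k = load_dataset("open-llm-leaderboard/details_AA051615__A0204",
                     "harness_gsm8k_5",
                     split="latest")
```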
## Latest results
These are the [latest results from run 2024-02-04T22:24:08.088490](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051615__A0204/blob/main/results_2024-02-04T22-24-08.088490.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.8557918259619507,
"acc_stderr": 0.022903287117849617,
"acc_norm": 0.8653459094330703,
"acc_norm_stderr": 0.02323340239827946,
"mc1": 0.4039167686658507,
"mc1_stderr": 0.01717727682258428,
"mc2": 0.57938549088033,
"mc2_stderr": 0.0155079218690995
},
"harness|arc:challenge|25": {
"acc": 0.6544368600682594,
"acc_stderr": 0.01389693846114568,
"acc_norm": 0.7030716723549488,
"acc_norm_stderr": 0.01335202597672522
},
"harness|hellaswag|10": {
"acc": 0.6471818362875921,
"acc_stderr": 0.004768701562988872,
"acc_norm": 0.8441545508862777,
"acc_norm_stderr": 0.0036196748640350157
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.8592592592592593,
"acc_stderr": 0.030041362609516897,
"acc_norm": 0.8592592592592593,
"acc_norm_stderr": 0.030041362609516897
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.9144736842105263,
"acc_stderr": 0.022758677130888604,
"acc_norm": 0.9144736842105263,
"acc_norm_stderr": 0.022758677130888604
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197768,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197768
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8981132075471698,
"acc_stderr": 0.01861754975827668,
"acc_norm": 0.8981132075471698,
"acc_norm_stderr": 0.01861754975827668
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9861111111111112,
"acc_stderr": 0.00978652836519694,
"acc_norm": 0.9861111111111112,
"acc_norm_stderr": 0.00978652836519694
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036844,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036844
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.861271676300578,
"acc_stderr": 0.02635654191584046,
"acc_norm": 0.861271676300578,
"acc_norm_stderr": 0.02635654191584046
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.0465501041131961,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.0465501041131961
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.8978723404255319,
"acc_stderr": 0.019795708842206803,
"acc_norm": 0.8978723404255319,
"acc_norm_stderr": 0.019795708842206803
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.7719298245614035,
"acc_stderr": 0.03947152782669415,
"acc_norm": 0.7719298245614035,
"acc_norm_stderr": 0.03947152782669415
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.903448275862069,
"acc_stderr": 0.024612198971682625,
"acc_norm": 0.903448275862069,
"acc_norm_stderr": 0.024612198971682625
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.8465608465608465,
"acc_stderr": 0.018562074482688474,
"acc_norm": 0.8465608465608465,
"acc_norm_stderr": 0.018562074482688474
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.6507936507936508,
"acc_stderr": 0.04263906892795131,
"acc_norm": 0.6507936507936508,
"acc_norm_stderr": 0.04263906892795131
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9451612903225807,
"acc_stderr": 0.012951418509899199,
"acc_norm": 0.9451612903225807,
"acc_norm_stderr": 0.012951418509899199
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.8472906403940886,
"acc_stderr": 0.025308904539380637,
"acc_norm": 0.8472906403940886,
"acc_norm_stderr": 0.025308904539380637
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.9333333333333333,
"acc_stderr": 0.019478290326359282,
"acc_norm": 0.9333333333333333,
"acc_norm_stderr": 0.019478290326359282
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9696969696969697,
"acc_stderr": 0.012213156893572809,
"acc_norm": 0.9696969696969697,
"acc_norm_stderr": 0.012213156893572809
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9896373056994818,
"acc_stderr": 0.007308424386792219,
"acc_norm": 0.9896373056994818,
"acc_norm_stderr": 0.007308424386792219
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8897435897435897,
"acc_stderr": 0.01588033126105611,
"acc_norm": 0.8897435897435897,
"acc_norm_stderr": 0.01588033126105611
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.026962424325073838,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.026962424325073838
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.9243697478991597,
"acc_stderr": 0.017174988814938515,
"acc_norm": 0.9243697478991597,
"acc_norm_stderr": 0.017174988814938515
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.7086092715231788,
"acc_stderr": 0.037101857261199966,
"acc_norm": 0.7086092715231788,
"acc_norm_stderr": 0.037101857261199966
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9596330275229358,
"acc_stderr": 0.008438519002748255,
"acc_norm": 0.9596330275229358,
"acc_norm_stderr": 0.008438519002748255
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.027467401804057993,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.027467401804057993
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9901960784313726,
"acc_stderr": 0.006915323418523288,
"acc_norm": 0.9901960784313726,
"acc_norm_stderr": 0.006915323418523288
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9493670886075949,
"acc_stderr": 0.014271760025370185,
"acc_norm": 0.9493670886075949,
"acc_norm_stderr": 0.014271760025370185
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.9013452914798207,
"acc_stderr": 0.020013729184919227,
"acc_norm": 0.9013452914798207,
"acc_norm_stderr": 0.020013729184919227
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.9083969465648855,
"acc_stderr": 0.025300035578642962,
"acc_norm": 0.9083969465648855,
"acc_norm_stderr": 0.025300035578642962
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9669421487603306,
"acc_stderr": 0.016321006329034302,
"acc_norm": 0.9669421487603306,
"acc_norm_stderr": 0.016321006329034302
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.9351851851851852,
"acc_stderr": 0.023800937426629216,
"acc_norm": 0.9351851851851852,
"acc_norm_stderr": 0.023800937426629216
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.9631901840490797,
"acc_stderr": 0.014793820323252032,
"acc_norm": 0.9631901840490797,
"acc_norm_stderr": 0.014793820323252032
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.7321428571428571,
"acc_stderr": 0.04203277291467763,
"acc_norm": 0.7321428571428571,
"acc_norm_stderr": 0.04203277291467763
},
"harness|hendrycksTest-management|5": {
"acc": 0.9029126213592233,
"acc_stderr": 0.02931596291881348,
"acc_norm": 0.9029126213592233,
"acc_norm_stderr": 0.02931596291881348
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9743589743589743,
"acc_stderr": 0.010354979197709014,
"acc_norm": 0.9743589743589743,
"acc_norm_stderr": 0.010354979197709014
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776348,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776348
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9425287356321839,
"acc_stderr": 0.008322796947412078,
"acc_norm": 0.9425287356321839,
"acc_norm_stderr": 0.008322796947412078
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8728323699421965,
"acc_stderr": 0.017936766865149886,
"acc_norm": 0.8728323699421965,
"acc_norm_stderr": 0.017936766865149886
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.841340782122905,
"acc_stderr": 0.01221939954934151,
"acc_norm": 0.841340782122905,
"acc_norm_stderr": 0.01221939954934151
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.9052287581699346,
"acc_stderr": 0.016771331271836467,
"acc_norm": 0.9052287581699346,
"acc_norm_stderr": 0.016771331271836467
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.887459807073955,
"acc_stderr": 0.017949292186800664,
"acc_norm": 0.887459807073955,
"acc_norm_stderr": 0.017949292186800664
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.904320987654321,
"acc_stderr": 0.016366973744175263,
"acc_norm": 0.904320987654321,
"acc_norm_stderr": 0.016366973744175263
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.7553191489361702,
"acc_stderr": 0.02564555362226673,
"acc_norm": 0.7553191489361702,
"acc_norm_stderr": 0.02564555362226673
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.8252933507170795,
"acc_stderr": 0.009698125789145208,
"acc_norm": 0.8252933507170795,
"acc_norm_stderr": 0.009698125789145208
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.9301470588235294,
"acc_stderr": 0.01548401244105634,
"acc_norm": 0.9301470588235294,
"acc_norm_stderr": 0.01548401244105634
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.9003267973856209,
"acc_stderr": 0.012119053136608476,
"acc_norm": 0.9003267973856209,
"acc_norm_stderr": 0.012119053136608476
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.036942843353378024,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.036942843353378024
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8979591836734694,
"acc_stderr": 0.01937850847450596,
"acc_norm": 0.8979591836734694,
"acc_norm_stderr": 0.01937850847450596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.945273631840796,
"acc_stderr": 0.016082815796263243,
"acc_norm": 0.945273631840796,
"acc_norm_stderr": 0.016082815796263243
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.97,
"acc_stderr": 0.01714466079977652,
"acc_norm": 0.97,
"acc_norm_stderr": 0.01714466079977652
},
"harness|hendrycksTest-virology|5": {
"acc": 0.6746987951807228,
"acc_stderr": 0.03647168523683226,
"acc_norm": 0.6746987951807228,
"acc_norm_stderr": 0.03647168523683226
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.9298245614035088,
"acc_stderr": 0.019591541754525123,
"acc_norm": 0.9298245614035088,
"acc_norm_stderr": 0.019591541754525123
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4039167686658507,
"mc1_stderr": 0.01717727682258428,
"mc2": 0.57938549088033,
"mc2_stderr": 0.0155079218690995
},
"harness|winogrande|5": {
"acc": 0.8082083662194159,
"acc_stderr": 0.011065209664659527
},
"harness|gsm8k|5": {
"acc": 0.5852918877937832,
"acc_stderr": 0.013570623842304511
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_AA051615__A0204 | [
"region:us"
] | 2024-02-04T22:26:22+00:00 | {"pretty_name": "Evaluation run of AA051615/A0204", "dataset_summary": "Dataset automatically created during the evaluation run of model [AA051615/A0204](https://huggingface.co/AA051615/A0204) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AA051615__A0204\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-04T22:24:08.088490](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051615__A0204/blob/main/results_2024-02-04T22-24-08.088490.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.8557918259619507,\n \"acc_stderr\": 0.022903287117849617,\n \"acc_norm\": 0.8653459094330703,\n \"acc_norm_stderr\": 0.02323340239827946,\n \"mc1\": 0.4039167686658507,\n \"mc1_stderr\": 0.01717727682258428,\n \"mc2\": 0.57938549088033,\n \"mc2_stderr\": 0.0155079218690995\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6544368600682594,\n \"acc_stderr\": 0.01389693846114568,\n \"acc_norm\": 0.7030716723549488,\n \"acc_norm_stderr\": 0.01335202597672522\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6471818362875921,\n \"acc_stderr\": 0.004768701562988872,\n \"acc_norm\": 0.8441545508862777,\n \"acc_norm_stderr\": 0.0036196748640350157\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.8592592592592593,\n \"acc_stderr\": 0.030041362609516897,\n \"acc_norm\": 0.8592592592592593,\n \"acc_norm_stderr\": 0.030041362609516897\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.9144736842105263,\n \"acc_stderr\": 0.022758677130888604,\n \"acc_norm\": 0.9144736842105263,\n \"acc_norm_stderr\": 0.022758677130888604\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197768,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197768\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8981132075471698,\n \"acc_stderr\": 0.01861754975827668,\n \"acc_norm\": 0.8981132075471698,\n \"acc_norm_stderr\": 0.01861754975827668\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9861111111111112,\n \"acc_stderr\": 0.00978652836519694,\n \"acc_norm\": 0.9861111111111112,\n \"acc_norm_stderr\": 0.00978652836519694\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 
0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036844,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036844\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.861271676300578,\n \"acc_stderr\": 0.02635654191584046,\n \"acc_norm\": 0.861271676300578,\n \"acc_norm_stderr\": 0.02635654191584046\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.0465501041131961,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.0465501041131961\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.8978723404255319,\n \"acc_stderr\": 0.019795708842206803,\n \"acc_norm\": 0.8978723404255319,\n \"acc_norm_stderr\": 0.019795708842206803\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.7719298245614035,\n \"acc_stderr\": 0.03947152782669415,\n \"acc_norm\": 0.7719298245614035,\n \"acc_norm_stderr\": 0.03947152782669415\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.903448275862069,\n \"acc_stderr\": 0.024612198971682625,\n \"acc_norm\": 0.903448275862069,\n \"acc_norm_stderr\": 0.024612198971682625\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.8465608465608465,\n \"acc_stderr\": 0.018562074482688474,\n \"acc_norm\": 0.8465608465608465,\n \"acc_norm_stderr\": 0.018562074482688474\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.6507936507936508,\n \"acc_stderr\": 0.04263906892795131,\n \"acc_norm\": 0.6507936507936508,\n \"acc_norm_stderr\": 0.04263906892795131\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.9451612903225807,\n \"acc_stderr\": 0.012951418509899199,\n \"acc_norm\": 0.9451612903225807,\n \"acc_norm_stderr\": 0.012951418509899199\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.8472906403940886,\n \"acc_stderr\": 0.025308904539380637,\n \"acc_norm\": 0.8472906403940886,\n \"acc_norm_stderr\": 0.025308904539380637\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.9333333333333333,\n \"acc_stderr\": 0.019478290326359282,\n \"acc_norm\": 0.9333333333333333,\n \"acc_norm_stderr\": 0.019478290326359282\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9696969696969697,\n \"acc_stderr\": 0.012213156893572809,\n \"acc_norm\": 0.9696969696969697,\n \"acc_norm_stderr\": 0.012213156893572809\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9896373056994818,\n \"acc_stderr\": 0.007308424386792219,\n \"acc_norm\": 0.9896373056994818,\n \"acc_norm_stderr\": 0.007308424386792219\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.8897435897435897,\n 
\"acc_stderr\": 0.01588033126105611,\n \"acc_norm\": 0.8897435897435897,\n \"acc_norm_stderr\": 0.01588033126105611\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.026962424325073838,\n \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.026962424325073838\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.9243697478991597,\n \"acc_stderr\": 0.017174988814938515,\n \"acc_norm\": 0.9243697478991597,\n \"acc_norm_stderr\": 0.017174988814938515\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.7086092715231788,\n \"acc_stderr\": 0.037101857261199966,\n \"acc_norm\": 0.7086092715231788,\n \"acc_norm_stderr\": 0.037101857261199966\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9596330275229358,\n \"acc_stderr\": 0.008438519002748255,\n \"acc_norm\": 0.9596330275229358,\n \"acc_norm_stderr\": 0.008438519002748255\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.027467401804057993,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.027467401804057993\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9901960784313726,\n \"acc_stderr\": 0.006915323418523288,\n \"acc_norm\": 0.9901960784313726,\n \"acc_norm_stderr\": 0.006915323418523288\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9493670886075949,\n \"acc_stderr\": 0.014271760025370185,\n \"acc_norm\": 0.9493670886075949,\n \"acc_norm_stderr\": 0.014271760025370185\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.9013452914798207,\n \"acc_stderr\": 0.020013729184919227,\n \"acc_norm\": 0.9013452914798207,\n \"acc_norm_stderr\": 0.020013729184919227\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.9083969465648855,\n \"acc_stderr\": 0.025300035578642962,\n \"acc_norm\": 0.9083969465648855,\n \"acc_norm_stderr\": 0.025300035578642962\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.9669421487603306,\n \"acc_stderr\": 0.016321006329034302,\n \"acc_norm\": 0.9669421487603306,\n \"acc_norm_stderr\": 0.016321006329034302\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.9351851851851852,\n \"acc_stderr\": 0.023800937426629216,\n \"acc_norm\": 0.9351851851851852,\n \"acc_norm_stderr\": 0.023800937426629216\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.9631901840490797,\n \"acc_stderr\": 0.014793820323252032,\n \"acc_norm\": 0.9631901840490797,\n \"acc_norm_stderr\": 0.014793820323252032\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.7321428571428571,\n \"acc_stderr\": 0.04203277291467763,\n \"acc_norm\": 0.7321428571428571,\n \"acc_norm_stderr\": 0.04203277291467763\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.9029126213592233,\n \"acc_stderr\": 0.02931596291881348,\n \"acc_norm\": 0.9029126213592233,\n \"acc_norm_stderr\": 0.02931596291881348\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9743589743589743,\n \"acc_stderr\": 0.010354979197709014,\n \"acc_norm\": 0.9743589743589743,\n \"acc_norm_stderr\": 0.010354979197709014\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776348,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776348\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9425287356321839,\n \"acc_stderr\": 0.008322796947412078,\n \"acc_norm\": 
0.9425287356321839,\n \"acc_norm_stderr\": 0.008322796947412078\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8728323699421965,\n \"acc_stderr\": 0.017936766865149886,\n \"acc_norm\": 0.8728323699421965,\n \"acc_norm_stderr\": 0.017936766865149886\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.841340782122905,\n \"acc_stderr\": 0.01221939954934151,\n \"acc_norm\": 0.841340782122905,\n \"acc_norm_stderr\": 0.01221939954934151\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.9052287581699346,\n \"acc_stderr\": 0.016771331271836467,\n \"acc_norm\": 0.9052287581699346,\n \"acc_norm_stderr\": 0.016771331271836467\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.887459807073955,\n \"acc_stderr\": 0.017949292186800664,\n \"acc_norm\": 0.887459807073955,\n \"acc_norm_stderr\": 0.017949292186800664\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.904320987654321,\n \"acc_stderr\": 0.016366973744175263,\n \"acc_norm\": 0.904320987654321,\n \"acc_norm_stderr\": 0.016366973744175263\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.7553191489361702,\n \"acc_stderr\": 0.02564555362226673,\n \"acc_norm\": 0.7553191489361702,\n \"acc_norm_stderr\": 0.02564555362226673\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.8252933507170795,\n \"acc_stderr\": 0.009698125789145208,\n \"acc_norm\": 0.8252933507170795,\n \"acc_norm_stderr\": 0.009698125789145208\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.9301470588235294,\n \"acc_stderr\": 0.01548401244105634,\n \"acc_norm\": 0.9301470588235294,\n \"acc_norm_stderr\": 0.01548401244105634\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.9003267973856209,\n \"acc_stderr\": 0.012119053136608476,\n \"acc_norm\": 0.9003267973856209,\n \"acc_norm_stderr\": 0.012119053136608476\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.036942843353378024,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.036942843353378024\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8979591836734694,\n \"acc_stderr\": 0.01937850847450596,\n \"acc_norm\": 0.8979591836734694,\n \"acc_norm_stderr\": 0.01937850847450596\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.945273631840796,\n \"acc_stderr\": 0.016082815796263243,\n \"acc_norm\": 0.945273631840796,\n \"acc_norm_stderr\": 0.016082815796263243\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.97,\n \"acc_stderr\": 0.01714466079977652,\n \"acc_norm\": 0.97,\n \"acc_norm_stderr\": 0.01714466079977652\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.6746987951807228,\n \"acc_stderr\": 0.03647168523683226,\n \"acc_norm\": 0.6746987951807228,\n \"acc_norm_stderr\": 0.03647168523683226\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.9298245614035088,\n \"acc_stderr\": 0.019591541754525123,\n \"acc_norm\": 0.9298245614035088,\n \"acc_norm_stderr\": 0.019591541754525123\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4039167686658507,\n \"mc1_stderr\": 0.01717727682258428,\n \"mc2\": 0.57938549088033,\n \"mc2_stderr\": 0.0155079218690995\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8082083662194159,\n \"acc_stderr\": 0.011065209664659527\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5852918877937832,\n \"acc_stderr\": 0.013570623842304511\n }\n}\n```", "repo_url": "https://huggingface.co/AA051615/A0204", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|arc:challenge|25_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|gsm8k|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hellaswag|10_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T22-24-08.088490.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T22-24-08.088490.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T22-24-08.088490.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T22-24-08.088490.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T22-24-08.088490.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T22-24-08.088490.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["**/details_harness|winogrande|5_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-04T22-24-08.088490.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_04T22_24_08.088490", "path": ["results_2024-02-04T22-24-08.088490.parquet"]}, {"split": "latest", "path": 
["results_2024-02-04T22-24-08.088490.parquet"]}]}]} | 2024-02-04T22:26:46+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of AA051615/A0204
Dataset automatically created during the evaluation run of model AA051615/A0204 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
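A minimal loading sketch is shown below. The repository id is assumed from the usual open-llm-leaderboard naming pattern for these detail datasets (it is not stated in this stripped card); the config and split names are taken from the configuration list of this dataset.

```python
from datasets import load_dataset

# Per-sample details for one task (here the 5-shot Winogrande harness);
# the "latest" split always points at the most recent evaluation run.
data = load_dataset(
    "open-llm-leaderboard/details_AA051615__A0204",  # assumed repo id, following the usual naming pattern
    "harness_winogrande_5",
    split="latest",
)
print(data)
```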
## Latest results
These are the latest results from run 2024-02-04T22:24:08.088490 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of AA051615/A0204\n\n\n\nDataset automatically created during the evaluation run of model AA051615/A0204 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T22:24:08.088490(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of AA051615/A0204\n\n\n\nDataset automatically created during the evaluation run of model AA051615/A0204 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T22:24:08.088490(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
6e64316f3b02e7e6d609d8cbfbc508bfb38c1a3c |
# Dataset Card for Evaluation run of s3nh/SeverusWestLake-7B-DPO
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [s3nh/SeverusWestLake-7B-DPO](https://huggingface.co/s3nh/SeverusWestLake-7B-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_s3nh__SeverusWestLake-7B-DPO",
"harness_winogrande_5",
split="train")
```
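If you are unsure which of the 63 configurations you need, you can list them programmatically; a small optional helper (not part of the original card):

```python
from datasets import get_dataset_config_names

# One config per evaluated task, plus the aggregated "results" config.
configs = get_dataset_config_names("open-llm-leaderboard/details_s3nh__SeverusWestLake-7B-DPO")
print(len(configs), "configurations, e.g.:", configs[:5])
```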
## Latest results
These are the [latest results from run 2024-02-04T22:42:18.214091](https://huggingface.co/datasets/open-llm-leaderboard/details_s3nh__SeverusWestLake-7B-DPO/blob/main/results_2024-02-04T22-42-18.214091.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6528292813248195,
"acc_stderr": 0.03206050479104449,
"acc_norm": 0.6518483054555606,
"acc_norm_stderr": 0.03273994102861642,
"mc1": 0.565483476132191,
"mc1_stderr": 0.01735273874925956,
"mc2": 0.7149221684392081,
"mc2_stderr": 0.014890972991763073
},
"harness|arc:challenge|25": {
"acc": 0.7056313993174061,
"acc_stderr": 0.01331852846053942,
"acc_norm": 0.7218430034129693,
"acc_norm_stderr": 0.013094469919538809
},
"harness|hellaswag|10": {
"acc": 0.7165903206532563,
"acc_stderr": 0.004497325533959636,
"acc_norm": 0.8893646683927504,
"acc_norm_stderr": 0.0031303894668331957
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.674074074074074,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.674074074074074,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.0356760379963917,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.0356760379963917
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.0255428468174005,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.0255428468174005
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033456,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033456
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.02403548967633508,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.02403548967633508
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.02866120111652457,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.02866120111652457
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.03038835355188679,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.03038835355188679
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8403669724770643,
"acc_stderr": 0.015703498348461763,
"acc_norm": 0.8403669724770643,
"acc_norm_stderr": 0.015703498348461763
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926924,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926924
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601446,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601446
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179326,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179326
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993464,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993464
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.023786203255508297,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.023786203255508297
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42793296089385474,
"acc_stderr": 0.016547887997416112,
"acc_norm": 0.42793296089385474,
"acc_norm_stderr": 0.016547887997416112
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.0256468630971379,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.0256468630971379
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.023788583551658533,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.023788583551658533
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.02977945095730307,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.02977945095730307
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47327249022164275,
"acc_stderr": 0.012751977967676008,
"acc_norm": 0.47327249022164275,
"acc_norm_stderr": 0.012751977967676008
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.02841820861940676,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.02841820861940676
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162673,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162673
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578337,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578337
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160896,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160896
},
"harness|truthfulqa:mc|0": {
"mc1": 0.565483476132191,
"mc1_stderr": 0.01735273874925956,
"mc2": 0.7149221684392081,
"mc2_stderr": 0.014890972991763073
},
"harness|winogrande|5": {
"acc": 0.8610891870560379,
"acc_stderr": 0.0097202009074021
},
"harness|gsm8k|5": {
"acc": 0.6914329037149356,
"acc_stderr": 0.012723076049815898
}
}
```
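To work with these aggregated numbers programmatically instead of reading the JSON above, you can load the "results" configuration mentioned earlier. A minimal sketch follows; the exact column layout of the parquet file may differ slightly from the JSON view, so inspect the features before relying on them.

```python
from datasets import load_dataset

# Aggregated metrics for the run; "latest" points at the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_s3nh__SeverusWestLake-7B-DPO",
    "results",
    split="latest",
)
print(results.features)  # inspect the available columns
print(results[0])        # the row holding the aggregated metrics
```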
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_s3nh__SeverusWestLake-7B-DPO | [
"region:us"
] | 2024-02-04T22:44:35+00:00 | {"pretty_name": "Evaluation run of s3nh/SeverusWestLake-7B-DPO", "dataset_summary": "Dataset automatically created during the evaluation run of model [s3nh/SeverusWestLake-7B-DPO](https://huggingface.co/s3nh/SeverusWestLake-7B-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_s3nh__SeverusWestLake-7B-DPO\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-04T22:42:18.214091](https://huggingface.co/datasets/open-llm-leaderboard/details_s3nh__SeverusWestLake-7B-DPO/blob/main/results_2024-02-04T22-42-18.214091.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6528292813248195,\n \"acc_stderr\": 0.03206050479104449,\n \"acc_norm\": 0.6518483054555606,\n \"acc_norm_stderr\": 0.03273994102861642,\n \"mc1\": 0.565483476132191,\n \"mc1_stderr\": 0.01735273874925956,\n \"mc2\": 0.7149221684392081,\n \"mc2_stderr\": 0.014890972991763073\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7056313993174061,\n \"acc_stderr\": 0.01331852846053942,\n \"acc_norm\": 0.7218430034129693,\n \"acc_norm_stderr\": 0.013094469919538809\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7165903206532563,\n \"acc_stderr\": 0.004497325533959636,\n \"acc_norm\": 0.8893646683927504,\n \"acc_norm_stderr\": 0.0031303894668331957\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.674074074074074,\n \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.674074074074074,\n \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.0356760379963917,\n \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.0356760379963917\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.0255428468174005,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.0255428468174005\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633508,\n \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633508\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652457,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652457\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.03038835355188679,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.03038835355188679\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8403669724770643,\n \"acc_stderr\": 0.015703498348461763,\n \"acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.015703498348461763\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926924,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926924\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601446,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601446\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.020930193185179326,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.020930193185179326\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n \"acc_stderr\": 0.013625556907993464,\n 
\"acc_norm\": 0.8237547892720306,\n \"acc_norm_stderr\": 0.013625556907993464\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.023786203255508297,\n \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.023786203255508297\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42793296089385474,\n \"acc_stderr\": 0.016547887997416112,\n \"acc_norm\": 0.42793296089385474,\n \"acc_norm_stderr\": 0.016547887997416112\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.0256468630971379,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.0256468630971379\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.023788583551658533,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.023788583551658533\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4716312056737589,\n \"acc_stderr\": 0.02977945095730307,\n \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.02977945095730307\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47327249022164275,\n \"acc_stderr\": 0.012751977967676008,\n \"acc_norm\": 0.47327249022164275,\n \"acc_norm_stderr\": 0.012751977967676008\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160896,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160896\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.565483476132191,\n \"mc1_stderr\": 0.01735273874925956,\n \"mc2\": 0.7149221684392081,\n \"mc2_stderr\": 0.014890972991763073\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8610891870560379,\n \"acc_stderr\": 0.0097202009074021\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6914329037149356,\n \"acc_stderr\": 0.012723076049815898\n }\n}\n```", "repo_url": 
"https://huggingface.co/s3nh/SeverusWestLake-7B-DPO", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|arc:challenge|25_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|gsm8k|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|hellaswag|10_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T22-42-18.214091.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T22-42-18.214091.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T22-42-18.214091.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T22-42-18.214091.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T22-42-18.214091.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_04T22_42_18.214091", "path": ["**/details_harness|winogrande|5_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-04T22-42-18.214091.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_04T22_42_18.214091", "path": ["results_2024-02-04T22-42-18.214091.parquet"]}, {"split": "latest", "path": ["results_2024-02-04T22-42-18.214091.parquet"]}]}]} | 2024-02-04T22:44:59+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of s3nh/SeverusWestLake-7B-DPO
Dataset automatically created during the evaluation run of model s3nh/SeverusWestLake-7B-DPO on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
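A minimal sketch (the repository id below follows the usual `open-llm-leaderboard/details_<org>__<model>` naming convention and is an assumption, as is the choice of configuration):

```python
from datasets import load_dataset

# Assumed details repo id, following the standard leaderboard naming scheme;
# the configuration can be any of the 63 listed in this card's metadata.
data = load_dataset(
    "open-llm-leaderboard/details_s3nh__SeverusWestLake-7B-DPO",
    "harness_winogrande_5",
    split="train",
)
```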
## Latest results
These are the latest results from run 2024-02-04T22:42:18.214091 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of s3nh/SeverusWestLake-7B-DPO\n\n\n\nDataset automatically created during the evaluation run of model s3nh/SeverusWestLake-7B-DPO on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T22:42:18.214091(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of s3nh/SeverusWestLake-7B-DPO\n\n\n\nDataset automatically created during the evaluation run of model s3nh/SeverusWestLake-7B-DPO on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T22:42:18.214091(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
0f91bda18e26b71f4194f8f8758b28d84a227915 |
# Dataset Card for Evaluation run of minghaowu/phi-2-OpenHermes-2.5
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [minghaowu/phi-2-OpenHermes-2.5](https://huggingface.co/minghaowu/phi-2-OpenHermes-2.5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_minghaowu__phi-2-OpenHermes-2.5",
"harness_winogrande_5",
split="train")
```
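
The aggregated scores live in the `"results"` configuration mentioned above; a hedged sketch for pulling them (the `"latest"` split name follows the file listing in this card's metadata and is assumed to point at the most recent run):

```python
from datasets import load_dataset

# "results" stores the aggregated metrics; the "latest" split tracks
# the most recent evaluation run of this model.
results = load_dataset(
    "open-llm-leaderboard/details_minghaowu__phi-2-OpenHermes-2.5",
    "results",
    split="latest",
)
print(results[0])  # one row of aggregated metrics per run
```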
## Latest results
These are the [latest results from run 2024-02-05T03:58:10.319056](https://huggingface.co/datasets/open-llm-leaderboard/details_minghaowu__phi-2-OpenHermes-2.5/blob/main/results_2024-02-05T03-58-10.319056.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5417779286978951,
"acc_stderr": 0.03387715104203166,
"acc_norm": 0.5515584103266279,
"acc_norm_stderr": 0.03480262412206108,
"mc1": 0.33047735618115054,
"mc1_stderr": 0.0164667696136983,
"mc2": 0.48101216272366176,
"mc2_stderr": 0.014866987436710487
},
"harness|arc:challenge|25": {
"acc": 0.5332764505119454,
"acc_stderr": 0.0145789958596058,
"acc_norm": 0.5648464163822525,
"acc_norm_stderr": 0.014487986197186043
},
"harness|hellaswag|10": {
"acc": 0.5468034256124278,
"acc_stderr": 0.004967872475383275,
"acc_norm": 0.738797052380004,
"acc_norm_stderr": 0.00438392514747874
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5460526315789473,
"acc_stderr": 0.04051646342874143,
"acc_norm": 0.5460526315789473,
"acc_norm_stderr": 0.04051646342874143
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5660377358490566,
"acc_stderr": 0.030503292013342592,
"acc_norm": 0.5660377358490566,
"acc_norm_stderr": 0.030503292013342592
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5902777777777778,
"acc_stderr": 0.04112490974670787,
"acc_norm": 0.5902777777777778,
"acc_norm_stderr": 0.04112490974670787
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5202312138728323,
"acc_stderr": 0.03809342081273956,
"acc_norm": 0.5202312138728323,
"acc_norm_stderr": 0.03809342081273956
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.047840607041056527,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.047840607041056527
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.502127659574468,
"acc_stderr": 0.03268572658667492,
"acc_norm": 0.502127659574468,
"acc_norm_stderr": 0.03268572658667492
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3508771929824561,
"acc_stderr": 0.044895393502707,
"acc_norm": 0.3508771929824561,
"acc_norm_stderr": 0.044895393502707
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.041641887201693775,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.041641887201693775
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.025305906241590632,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.025305906241590632
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6516129032258065,
"acc_stderr": 0.02710482632810094,
"acc_norm": 0.6516129032258065,
"acc_norm_stderr": 0.02710482632810094
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.034991131376767445,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.034991131376767445
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.03756335775187898,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.03756335775187898
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7121212121212122,
"acc_stderr": 0.03225883512300992,
"acc_norm": 0.7121212121212122,
"acc_norm_stderr": 0.03225883512300992
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7461139896373057,
"acc_stderr": 0.03141024780565322,
"acc_norm": 0.7461139896373057,
"acc_norm_stderr": 0.03141024780565322
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5384615384615384,
"acc_stderr": 0.025275892070240644,
"acc_norm": 0.5384615384615384,
"acc_norm_stderr": 0.025275892070240644
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945277,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945277
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5630252100840336,
"acc_stderr": 0.03221943636566196,
"acc_norm": 0.5630252100840336,
"acc_norm_stderr": 0.03221943636566196
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7651376146788991,
"acc_stderr": 0.018175110510343574,
"acc_norm": 0.7651376146788991,
"acc_norm_stderr": 0.018175110510343574
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.39814814814814814,
"acc_stderr": 0.033384734032074016,
"acc_norm": 0.39814814814814814,
"acc_norm_stderr": 0.033384734032074016
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6568627450980392,
"acc_stderr": 0.033321399446680854,
"acc_norm": 0.6568627450980392,
"acc_norm_stderr": 0.033321399446680854
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7426160337552743,
"acc_stderr": 0.028458820991460285,
"acc_norm": 0.7426160337552743,
"acc_norm_stderr": 0.028458820991460285
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6367713004484304,
"acc_stderr": 0.032277904428505,
"acc_norm": 0.6367713004484304,
"acc_norm_stderr": 0.032277904428505
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6793893129770993,
"acc_stderr": 0.04093329229834278,
"acc_norm": 0.6793893129770993,
"acc_norm_stderr": 0.04093329229834278
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.71900826446281,
"acc_stderr": 0.04103203830514512,
"acc_norm": 0.71900826446281,
"acc_norm_stderr": 0.04103203830514512
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7177914110429447,
"acc_stderr": 0.03536117886664743,
"acc_norm": 0.7177914110429447,
"acc_norm_stderr": 0.03536117886664743
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.6990291262135923,
"acc_stderr": 0.04541609446503947,
"acc_norm": 0.6990291262135923,
"acc_norm_stderr": 0.04541609446503947
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.811965811965812,
"acc_stderr": 0.025598193686652258,
"acc_norm": 0.811965811965812,
"acc_norm_stderr": 0.025598193686652258
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7100893997445722,
"acc_stderr": 0.016225017944770978,
"acc_norm": 0.7100893997445722,
"acc_norm_stderr": 0.016225017944770978
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6329479768786127,
"acc_stderr": 0.02595005433765408,
"acc_norm": 0.6329479768786127,
"acc_norm_stderr": 0.02595005433765408
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2201117318435754,
"acc_stderr": 0.013856994024227173,
"acc_norm": 0.2201117318435754,
"acc_norm_stderr": 0.013856994024227173
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5947712418300654,
"acc_stderr": 0.028110928492809068,
"acc_norm": 0.5947712418300654,
"acc_norm_stderr": 0.028110928492809068
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5787781350482315,
"acc_stderr": 0.028043399858210628,
"acc_norm": 0.5787781350482315,
"acc_norm_stderr": 0.028043399858210628
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.02751374728437942,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.02751374728437942
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.41134751773049644,
"acc_stderr": 0.02935491115994098,
"acc_norm": 0.41134751773049644,
"acc_norm_stderr": 0.02935491115994098
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4172099087353325,
"acc_stderr": 0.012593959992906417,
"acc_norm": 0.4172099087353325,
"acc_norm_stderr": 0.012593959992906417
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.40808823529411764,
"acc_stderr": 0.029855261393483927,
"acc_norm": 0.40808823529411764,
"acc_norm_stderr": 0.029855261393483927
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5604575163398693,
"acc_stderr": 0.020079420408087918,
"acc_norm": 0.5604575163398693,
"acc_norm_stderr": 0.020079420408087918
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5727272727272728,
"acc_stderr": 0.04738198703545483,
"acc_norm": 0.5727272727272728,
"acc_norm_stderr": 0.04738198703545483
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6653061224489796,
"acc_stderr": 0.030209235226242307,
"acc_norm": 0.6653061224489796,
"acc_norm_stderr": 0.030209235226242307
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7711442786069652,
"acc_stderr": 0.029705284056772443,
"acc_norm": 0.7711442786069652,
"acc_norm_stderr": 0.029705284056772443
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.672514619883041,
"acc_stderr": 0.03599335771456027,
"acc_norm": 0.672514619883041,
"acc_norm_stderr": 0.03599335771456027
},
"harness|truthfulqa:mc|0": {
"mc1": 0.33047735618115054,
"mc1_stderr": 0.0164667696136983,
"mc2": 0.48101216272366176,
"mc2_stderr": 0.014866987436710487
},
"harness|winogrande|5": {
"acc": 0.7300710339384373,
"acc_stderr": 0.012476433372002592
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_minghaowu__phi-2-OpenHermes-2.5 | [
"region:us"
] | 2024-02-04T22:50:54+00:00 | {"pretty_name": "Evaluation run of minghaowu/phi-2-OpenHermes-2.5", "dataset_summary": "Dataset automatically created during the evaluation run of model [minghaowu/phi-2-OpenHermes-2.5](https://huggingface.co/minghaowu/phi-2-OpenHermes-2.5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_minghaowu__phi-2-OpenHermes-2.5\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-05T03:58:10.319056](https://huggingface.co/datasets/open-llm-leaderboard/details_minghaowu__phi-2-OpenHermes-2.5/blob/main/results_2024-02-05T03-58-10.319056.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5417779286978951,\n \"acc_stderr\": 0.03387715104203166,\n \"acc_norm\": 0.5515584103266279,\n \"acc_norm_stderr\": 0.03480262412206108,\n \"mc1\": 0.33047735618115054,\n \"mc1_stderr\": 0.0164667696136983,\n \"mc2\": 0.48101216272366176,\n \"mc2_stderr\": 0.014866987436710487\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5332764505119454,\n \"acc_stderr\": 0.0145789958596058,\n \"acc_norm\": 0.5648464163822525,\n \"acc_norm_stderr\": 0.014487986197186043\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5468034256124278,\n \"acc_stderr\": 0.004967872475383275,\n \"acc_norm\": 0.738797052380004,\n \"acc_norm_stderr\": 0.00438392514747874\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5460526315789473,\n \"acc_stderr\": 0.04051646342874143,\n \"acc_norm\": 0.5460526315789473,\n \"acc_norm_stderr\": 0.04051646342874143\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5660377358490566,\n \"acc_stderr\": 0.030503292013342592,\n \"acc_norm\": 0.5660377358490566,\n \"acc_norm_stderr\": 0.030503292013342592\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5902777777777778,\n \"acc_stderr\": 0.04112490974670787,\n \"acc_norm\": 0.5902777777777778,\n \"acc_norm_stderr\": 0.04112490974670787\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5202312138728323,\n \"acc_stderr\": 0.03809342081273956,\n \"acc_norm\": 0.5202312138728323,\n \"acc_norm_stderr\": 0.03809342081273956\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.502127659574468,\n \"acc_stderr\": 0.03268572658667492,\n \"acc_norm\": 0.502127659574468,\n \"acc_norm_stderr\": 0.03268572658667492\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3508771929824561,\n \"acc_stderr\": 0.044895393502707,\n \"acc_norm\": 0.3508771929824561,\n \"acc_norm_stderr\": 0.044895393502707\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.041641887201693775,\n \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.041641887201693775\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.025305906241590632,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.025305906241590632\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6516129032258065,\n \"acc_stderr\": 0.02710482632810094,\n \"acc_norm\": 0.6516129032258065,\n \"acc_norm_stderr\": 0.02710482632810094\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.034991131376767445,\n \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.034991131376767445\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.03756335775187898,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.03756335775187898\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7121212121212122,\n \"acc_stderr\": 0.03225883512300992,\n \"acc_norm\": 0.7121212121212122,\n \"acc_norm_stderr\": 0.03225883512300992\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7461139896373057,\n \"acc_stderr\": 0.03141024780565322,\n \"acc_norm\": 0.7461139896373057,\n \"acc_norm_stderr\": 
0.03141024780565322\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5384615384615384,\n \"acc_stderr\": 0.025275892070240644,\n \"acc_norm\": 0.5384615384615384,\n \"acc_norm_stderr\": 0.025275892070240644\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945277,\n \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945277\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5630252100840336,\n \"acc_stderr\": 0.03221943636566196,\n \"acc_norm\": 0.5630252100840336,\n \"acc_norm_stderr\": 0.03221943636566196\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7651376146788991,\n \"acc_stderr\": 0.018175110510343574,\n \"acc_norm\": 0.7651376146788991,\n \"acc_norm_stderr\": 0.018175110510343574\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.39814814814814814,\n \"acc_stderr\": 0.033384734032074016,\n \"acc_norm\": 0.39814814814814814,\n \"acc_norm_stderr\": 0.033384734032074016\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6568627450980392,\n \"acc_stderr\": 0.033321399446680854,\n \"acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.033321399446680854\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7426160337552743,\n \"acc_stderr\": 0.028458820991460285,\n \"acc_norm\": 0.7426160337552743,\n \"acc_norm_stderr\": 0.028458820991460285\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6367713004484304,\n \"acc_stderr\": 0.032277904428505,\n \"acc_norm\": 0.6367713004484304,\n \"acc_norm_stderr\": 0.032277904428505\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6793893129770993,\n \"acc_stderr\": 0.04093329229834278,\n \"acc_norm\": 0.6793893129770993,\n \"acc_norm_stderr\": 0.04093329229834278\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.71900826446281,\n \"acc_stderr\": 0.04103203830514512,\n \"acc_norm\": 0.71900826446281,\n \"acc_norm_stderr\": 0.04103203830514512\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6851851851851852,\n \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.6851851851851852,\n \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664743,\n \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664743\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.04541609446503947,\n \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.04541609446503947\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.811965811965812,\n \"acc_stderr\": 0.025598193686652258,\n \"acc_norm\": 0.811965811965812,\n \"acc_norm_stderr\": 0.025598193686652258\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.04960449637488583\n },\n \"harness|hendrycksTest-miscellaneous|5\": 
{\n \"acc\": 0.7100893997445722,\n \"acc_stderr\": 0.016225017944770978,\n \"acc_norm\": 0.7100893997445722,\n \"acc_norm_stderr\": 0.016225017944770978\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6329479768786127,\n \"acc_stderr\": 0.02595005433765408,\n \"acc_norm\": 0.6329479768786127,\n \"acc_norm_stderr\": 0.02595005433765408\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2201117318435754,\n \"acc_stderr\": 0.013856994024227173,\n \"acc_norm\": 0.2201117318435754,\n \"acc_norm_stderr\": 0.013856994024227173\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5947712418300654,\n \"acc_stderr\": 0.028110928492809068,\n \"acc_norm\": 0.5947712418300654,\n \"acc_norm_stderr\": 0.028110928492809068\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5787781350482315,\n \"acc_stderr\": 0.028043399858210628,\n \"acc_norm\": 0.5787781350482315,\n \"acc_norm_stderr\": 0.028043399858210628\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5740740740740741,\n \"acc_stderr\": 0.02751374728437942,\n \"acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.02751374728437942\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.41134751773049644,\n \"acc_stderr\": 0.02935491115994098,\n \"acc_norm\": 0.41134751773049644,\n \"acc_norm_stderr\": 0.02935491115994098\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4172099087353325,\n \"acc_stderr\": 0.012593959992906417,\n \"acc_norm\": 0.4172099087353325,\n \"acc_norm_stderr\": 0.012593959992906417\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.40808823529411764,\n \"acc_stderr\": 0.029855261393483927,\n \"acc_norm\": 0.40808823529411764,\n \"acc_norm_stderr\": 0.029855261393483927\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5604575163398693,\n \"acc_stderr\": 0.020079420408087918,\n \"acc_norm\": 0.5604575163398693,\n \"acc_norm_stderr\": 0.020079420408087918\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5727272727272728,\n \"acc_stderr\": 0.04738198703545483,\n \"acc_norm\": 0.5727272727272728,\n \"acc_norm_stderr\": 0.04738198703545483\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6653061224489796,\n \"acc_stderr\": 0.030209235226242307,\n \"acc_norm\": 0.6653061224489796,\n \"acc_norm_stderr\": 0.030209235226242307\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7711442786069652,\n \"acc_stderr\": 0.029705284056772443,\n \"acc_norm\": 0.7711442786069652,\n \"acc_norm_stderr\": 0.029705284056772443\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.4397590361445783,\n \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.672514619883041,\n \"acc_stderr\": 0.03599335771456027,\n \"acc_norm\": 0.672514619883041,\n \"acc_norm_stderr\": 0.03599335771456027\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.33047735618115054,\n \"mc1_stderr\": 0.0164667696136983,\n \"mc2\": 0.48101216272366176,\n \"mc2_stderr\": 0.014866987436710487\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7300710339384373,\n \"acc_stderr\": 0.012476433372002592\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", 
"repo_url": "https://huggingface.co/minghaowu/phi-2-OpenHermes-2.5", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|arc:challenge|25_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|arc:challenge|25_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|gsm8k|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|gsm8k|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hellaswag|10_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|hellaswag|10_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T22-49-11.717360.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T22-49-11.717360.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T22-49-11.717360.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T22-49-11.717360.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T22-49-11.717360.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T22-49-11.717360.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T22-49-11.717360.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T22-49-11.717360.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T22-49-11.717360.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T22-49-11.717360.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T22-49-11.717360.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T22-49-11.717360.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T22-49-11.717360.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T22-49-11.717360.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T22-49-11.717360.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T22-49-11.717360.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T22-49-11.717360.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T22-49-11.717360.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T22-49-11.717360.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T22-49-11.717360.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T22-49-11.717360.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T22-49-11.717360.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T22-49-11.717360.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T22-49-11.717360.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T22-49-11.717360.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T22-49-11.717360.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T22-49-11.717360.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T22-49-11.717360.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T22-49-11.717360.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T22-49-11.717360.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T22-49-11.717360.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T22-49-11.717360.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T22-49-11.717360.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T22-49-11.717360.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T22-49-11.717360.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T22-49-11.717360.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T22-49-11.717360.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T22-49-11.717360.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T22-49-11.717360.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T22-49-11.717360.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T22-49-11.717360.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T22-49-11.717360.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T22-49-11.717360.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T22-49-11.717360.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T22-49-11.717360.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T22-49-11.717360.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T22-49-11.717360.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T22-49-11.717360.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T22-49-11.717360.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T22-49-11.717360.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T22-49-11.717360.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T22-49-11.717360.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T22-49-11.717360.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T22-49-11.717360.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T22-49-11.717360.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T22-49-11.717360.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-05T03-58-10.319056.parquet", 
"**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T03-58-10.319056.parquet", 
"**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T03-58-10.319056.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-05T03-58-10.319056.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T22-49-11.717360.parquet"]}, 
{"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["**/details_harness|winogrande|5_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": ["**/details_harness|winogrande|5_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-05T03-58-10.319056.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_04T22_49_11.717360", "path": ["results_2024-02-04T22-49-11.717360.parquet"]}, {"split": "2024_02_05T03_58_10.319056", "path": 
["results_2024-02-05T03-58-10.319056.parquet"]}, {"split": "latest", "path": ["results_2024-02-05T03-58-10.319056.parquet"]}]}]} | 2024-02-05T04:00:01+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of minghaowu/phi-2-OpenHermes-2.5
Dataset automatically created during the evaluation run of model minghaowu/phi-2-OpenHermes-2.5 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
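A minimal sketch of that call is below. The repository id is inferred from the card title and the `details_<org>__<model>` naming pattern used elsewhere in this dump, so treat it as an assumption; any of the 63 configurations can be substituted for the one shown.

```python
from datasets import load_dataset

# Assumed repository id, inferred from the card title above; details datasets in
# this dump follow the pattern open-llm-leaderboard/details_<org>__<model>.
data = load_dataset(
    "open-llm-leaderboard/details_minghaowu__phi-2-OpenHermes-2.5",
    "harness_winogrande_5",  # any of the 63 configurations works here
    split="train",           # "train" always points to the latest results
)
print(data)
```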
## Latest results
These are the latest results from run 2024-02-05T03:58:10.319056 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of minghaowu/phi-2-OpenHermes-2.5\n\n\n\nDataset automatically created during the evaluation run of model minghaowu/phi-2-OpenHermes-2.5 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-05T03:58:10.319056(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of minghaowu/phi-2-OpenHermes-2.5\n\n\n\nDataset automatically created during the evaluation run of model minghaowu/phi-2-OpenHermes-2.5 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-05T03:58:10.319056(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
5755d334b648e90f10c37663caf3d7476c963534 |
# Dataset Card for Evaluation run of Eric111/NeuralBeagleOpenChat
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Eric111/NeuralBeagleOpenChat](https://huggingface.co/Eric111/NeuralBeagleOpenChat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Eric111__NeuralBeagleOpenChat",
"harness_winogrande_5",
split="train")
```
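The aggregated numbers can be pulled the same way through the "results" configuration mentioned above; a sketch, assuming it follows the same split layout as the per-task configurations (one timestamped split per run plus "latest"):

```python
from datasets import load_dataset

# Aggregated metrics for the run; "latest" mirrors the newest timestamped split.
results = load_dataset(
    "open-llm-leaderboard/details_Eric111__NeuralBeagleOpenChat",
    "results",
    split="latest",
)
print(results)  # inspect the aggregated metrics row(s)
```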
## Latest results
These are the [latest results from run 2024-02-04T23:07:18.461587](https://huggingface.co/datasets/open-llm-leaderboard/details_Eric111__NeuralBeagleOpenChat/blob/main/results_2024-02-04T23-07-18.461587.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6606606326765095,
"acc_stderr": 0.03176172876813684,
"acc_norm": 0.6604815290872796,
"acc_norm_stderr": 0.03242278367605966,
"mc1": 0.42962056303549573,
"mc1_stderr": 0.017329234580409098,
"mc2": 0.6091160375508784,
"mc2_stderr": 0.015202500436432948
},
"harness|arc:challenge|25": {
"acc": 0.6629692832764505,
"acc_stderr": 0.013813476652902272,
"acc_norm": 0.7030716723549488,
"acc_norm_stderr": 0.013352025976725223
},
"harness|hellaswag|10": {
"acc": 0.6699860585540729,
"acc_stderr": 0.004692567655961763,
"acc_norm": 0.8625771758613822,
"acc_norm_stderr": 0.0034358953866922537
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.03823428969926605,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.03823428969926605
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700918,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7916666666666666,
"acc_stderr": 0.03396116205845333,
"acc_norm": 0.7916666666666666,
"acc_norm_stderr": 0.03396116205845333
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.03514942551267438,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.03514942551267438
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287533,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287533
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.025446365634406783,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.025446365634406783
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.023415293433568525,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.023415293433568525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.031922715695483016,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.031922715695483016
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033477,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033477
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6794871794871795,
"acc_stderr": 0.02366129639396428,
"acc_norm": 0.6794871794871795,
"acc_norm_stderr": 0.02366129639396428
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.029116617606083008,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.029116617606083008
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.030066761582977934,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.030066761582977934
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8605504587155963,
"acc_stderr": 0.014852421490033059,
"acc_norm": 0.8605504587155963,
"acc_norm_stderr": 0.014852421490033059
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8725490196078431,
"acc_stderr": 0.02340553048084631,
"acc_norm": 0.8725490196078431,
"acc_norm_stderr": 0.02340553048084631
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.025085961144579654,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.025085961144579654
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.03498149385462472,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.03498149385462472
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097653,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097653
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.02023714900899093,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.02023714900899093
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8390804597701149,
"acc_stderr": 0.013140225515611724,
"acc_norm": 0.8390804597701149,
"acc_norm_stderr": 0.013140225515611724
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7601156069364162,
"acc_stderr": 0.022989592543123567,
"acc_norm": 0.7601156069364162,
"acc_norm_stderr": 0.022989592543123567
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.37206703910614525,
"acc_stderr": 0.016165847583563295,
"acc_norm": 0.37206703910614525,
"acc_norm_stderr": 0.016165847583563295
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.025917806117147158,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.025917806117147158
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.025311765975426122,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.025311765975426122
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.02378858355165854,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.02378858355165854
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5106382978723404,
"acc_stderr": 0.02982074719142244,
"acc_norm": 0.5106382978723404,
"acc_norm_stderr": 0.02982074719142244
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.48826597131681876,
"acc_stderr": 0.012766719019686724,
"acc_norm": 0.48826597131681876,
"acc_norm_stderr": 0.012766719019686724
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.027678468642144717,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.027678468642144717
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.01895088677080631,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.01895088677080631
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.42962056303549573,
"mc1_stderr": 0.017329234580409098,
"mc2": 0.6091160375508784,
"mc2_stderr": 0.015202500436432948
},
"harness|winogrande|5": {
"acc": 0.8208366219415943,
"acc_stderr": 0.010777949156047986
},
"harness|gsm8k|5": {
"acc": 0.7437452615617892,
"acc_stderr": 0.012025145867332844
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Eric111__NeuralBeagleOpenChat | [
"region:us"
] | 2024-02-04T23:09:38+00:00 | {"pretty_name": "Evaluation run of Eric111/NeuralBeagleOpenChat", "dataset_summary": "Dataset automatically created during the evaluation run of model [Eric111/NeuralBeagleOpenChat](https://huggingface.co/Eric111/NeuralBeagleOpenChat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Eric111__NeuralBeagleOpenChat\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-04T23:07:18.461587](https://huggingface.co/datasets/open-llm-leaderboard/details_Eric111__NeuralBeagleOpenChat/blob/main/results_2024-02-04T23-07-18.461587.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6606606326765095,\n \"acc_stderr\": 0.03176172876813684,\n \"acc_norm\": 0.6604815290872796,\n \"acc_norm_stderr\": 0.03242278367605966,\n \"mc1\": 0.42962056303549573,\n \"mc1_stderr\": 0.017329234580409098,\n \"mc2\": 0.6091160375508784,\n \"mc2_stderr\": 0.015202500436432948\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6629692832764505,\n \"acc_stderr\": 0.013813476652902272,\n \"acc_norm\": 0.7030716723549488,\n \"acc_norm_stderr\": 0.013352025976725223\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6699860585540729,\n \"acc_stderr\": 0.004692567655961763,\n \"acc_norm\": 0.8625771758613822,\n \"acc_norm_stderr\": 0.0034358953866922537\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.03823428969926605,\n \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.03823428969926605\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700918,\n \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700918\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7916666666666666,\n \"acc_stderr\": 0.03396116205845333,\n \"acc_norm\": 0.7916666666666666,\n \"acc_norm_stderr\": 0.03396116205845333\n },\n \"harness|hendrycksTest-college_chemistry|5\": 
{\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.03514942551267438,\n \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.03514942551267438\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287533,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287533\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42328042328042326,\n \"acc_stderr\": 0.025446365634406783,\n \"acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.025446365634406783\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n \"acc_stderr\": 0.023415293433568525,\n \"acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.023415293433568525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695483016,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695483016\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033477,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033477\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n 
\"acc\": 0.6794871794871795,\n \"acc_stderr\": 0.02366129639396428,\n \"acc_norm\": 0.6794871794871795,\n \"acc_norm_stderr\": 0.02366129639396428\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35185185185185186,\n \"acc_stderr\": 0.029116617606083008,\n \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.029116617606083008\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977934,\n \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977934\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8605504587155963,\n \"acc_stderr\": 0.014852421490033059,\n \"acc_norm\": 0.8605504587155963,\n \"acc_norm_stderr\": 0.014852421490033059\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8725490196078431,\n \"acc_stderr\": 0.02340553048084631,\n \"acc_norm\": 0.8725490196078431,\n \"acc_norm_stderr\": 0.02340553048084631\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8185654008438819,\n \"acc_stderr\": 0.025085961144579654,\n \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.025085961144579654\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462472,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462472\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097653,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097653\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n \"acc_stderr\": 0.02023714900899093,\n \"acc_norm\": 0.8931623931623932,\n \"acc_norm_stderr\": 0.02023714900899093\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8390804597701149,\n \"acc_stderr\": 0.013140225515611724,\n \"acc_norm\": 0.8390804597701149,\n 
\"acc_norm_stderr\": 0.013140225515611724\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7601156069364162,\n \"acc_stderr\": 0.022989592543123567,\n \"acc_norm\": 0.7601156069364162,\n \"acc_norm_stderr\": 0.022989592543123567\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.37206703910614525,\n \"acc_stderr\": 0.016165847583563295,\n \"acc_norm\": 0.37206703910614525,\n \"acc_norm_stderr\": 0.016165847583563295\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.025917806117147158,\n \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.025917806117147158\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n \"acc_stderr\": 0.025311765975426122,\n \"acc_norm\": 0.7266881028938906,\n \"acc_norm_stderr\": 0.025311765975426122\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.02378858355165854,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.02378858355165854\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5106382978723404,\n \"acc_stderr\": 0.02982074719142244,\n \"acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.02982074719142244\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.48826597131681876,\n \"acc_stderr\": 0.012766719019686724,\n \"acc_norm\": 0.48826597131681876,\n \"acc_norm_stderr\": 0.012766719019686724\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.027678468642144717,\n \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.027678468642144717\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6748366013071896,\n \"acc_stderr\": 0.01895088677080631,\n \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.01895088677080631\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.42962056303549573,\n \"mc1_stderr\": 0.017329234580409098,\n \"mc2\": 0.6091160375508784,\n \"mc2_stderr\": 0.015202500436432948\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8208366219415943,\n \"acc_stderr\": 0.010777949156047986\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7437452615617892,\n \"acc_stderr\": 0.012025145867332844\n }\n}\n```", "repo_url": "https://huggingface.co/Eric111/NeuralBeagleOpenChat", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|arc:challenge|25_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|gsm8k|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hellaswag|10_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T23-07-18.461587.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T23-07-18.461587.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T23-07-18.461587.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T23-07-18.461587.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T23-07-18.461587.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T23-07-18.461587.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["**/details_harness|winogrande|5_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-04T23-07-18.461587.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_04T23_07_18.461587", "path": ["results_2024-02-04T23-07-18.461587.parquet"]}, {"split": "latest", "path": 
["results_2024-02-04T23-07-18.461587.parquet"]}]}]} | 2024-02-04T23:10:01+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Eric111/NeuralBeagleOpenChat
Dataset automatically created during the evaluation run of model Eric111/NeuralBeagleOpenChat on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
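A minimal loading sketch is shown below; the repository id is assumed to follow the details_<org>__<model> naming pattern used by the other leaderboard detail datasets in this dump, and the chosen configuration is one of the 63 listed in the metadata.

```python
from datasets import load_dataset

# Assumed repo id, following the details_<org>__<model> naming pattern
# used by the Open LLM Leaderboard detail datasets.
data = load_dataset(
    "open-llm-leaderboard/details_Eric111__NeuralBeagleOpenChat",
    "harness_winogrande_5",  # one of the 63 per-task configurations
    split="train",           # "train" always points to the latest results
)
```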
## Latest results
These are the latest results from run 2024-02-04T23:07:18.461587 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
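The aggregated metrics can also be pulled programmatically from the "results" configuration; a minimal sketch, assuming the same repository id as above:

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of the run;
# the "latest" split points at the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_Eric111__NeuralBeagleOpenChat",
    "results",
    split="latest",
)
print(results[0])
```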
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Eric111/NeuralBeagleOpenChat\n\n\n\nDataset automatically created during the evaluation run of model Eric111/NeuralBeagleOpenChat on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T23:07:18.461587(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Eric111/NeuralBeagleOpenChat\n\n\n\nDataset automatically created during the evaluation run of model Eric111/NeuralBeagleOpenChat on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T23:07:18.461587(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
6e391f01d49a5fead324f5d02b3ede59150772ef | # Dataset Card for "Wikipedia-Hindi"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | zicsx/Wikipedia-Hindi | [
"region:us"
] | 2024-02-04T23:41:14+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 576464909.5937778, "num_examples": 154867}], "download_size": 216951489, "dataset_size": 576464909.5937778}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-04T23:49:20+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "Wikipedia-Hindi"
More Information needed | [
"# Dataset Card for \"Wikipedia-Hindi\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"Wikipedia-Hindi\"\n\nMore Information needed"
] |
399fae71b33d96d2ce65e1609391305bf575302f |
# Dataset Card for Evaluation run of AIGym/TinyLlama-1.1B-2.5T-chat-and-function-calling
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AIGym/TinyLlama-1.1B-2.5T-chat-and-function-calling](https://huggingface.co/AIGym/TinyLlama-1.1B-2.5T-chat-and-function-calling) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AIGym__TinyLlama-1.1B-2.5T-chat-and-function-calling",
"harness_winogrande_5",
split="train")
```
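To discover the available configurations programmatically rather than reading them off this card, the `datasets` library's config-listing helper can be used; this is a sketch and not part of the original card:

```python
from datasets import get_dataset_config_names

# Lists the per-task configurations plus the aggregated "results" configuration.
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_AIGym__TinyLlama-1.1B-2.5T-chat-and-function-calling"
)
print(len(configs), configs[:5])
```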
## Latest results
These are the [latest results from run 2024-02-04T23:43:15.860527](https://huggingface.co/datasets/open-llm-leaderboard/details_AIGym__TinyLlama-1.1B-2.5T-chat-and-function-calling/blob/main/results_2024-02-04T23-43-15.860527.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.26912346478859156,
"acc_stderr": 0.03120091714101033,
"acc_norm": 0.27025742576226947,
"acc_norm_stderr": 0.03196861086820363,
"mc1": 0.22766217870257038,
"mc1_stderr": 0.01467925503211107,
"mc2": 0.38916106981756643,
"mc2_stderr": 0.014160770891106955
},
"harness|arc:challenge|25": {
"acc": 0.32337883959044367,
"acc_stderr": 0.013669421630012136,
"acc_norm": 0.3438566552901024,
"acc_norm_stderr": 0.013880644570156217
},
"harness|hellaswag|10": {
"acc": 0.4509061939852619,
"acc_stderr": 0.004965670398127352,
"acc_norm": 0.5960963951404102,
"acc_norm_stderr": 0.004896757857022549
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322716,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322716
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.038201699145179055,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.038201699145179055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.03317672787533157,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.03317672787533157
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2641509433962264,
"acc_stderr": 0.02713429162874171,
"acc_norm": 0.2641509433962264,
"acc_norm_stderr": 0.02713429162874171
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2023121387283237,
"acc_stderr": 0.03063114553919882,
"acc_norm": 0.2023121387283237,
"acc_norm_stderr": 0.03063114553919882
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171452,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171452
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.31063829787234043,
"acc_stderr": 0.03025123757921317,
"acc_norm": 0.31063829787234043,
"acc_norm_stderr": 0.03025123757921317
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.26455026455026454,
"acc_stderr": 0.022717467897708614,
"acc_norm": 0.26455026455026454,
"acc_norm_stderr": 0.022717467897708614
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23015873015873015,
"acc_stderr": 0.03764950879790606,
"acc_norm": 0.23015873015873015,
"acc_norm_stderr": 0.03764950879790606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25483870967741934,
"acc_stderr": 0.024790118459332208,
"acc_norm": 0.25483870967741934,
"acc_norm_stderr": 0.024790118459332208
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.24630541871921183,
"acc_stderr": 0.03031509928561773,
"acc_norm": 0.24630541871921183,
"acc_norm_stderr": 0.03031509928561773
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.32727272727272727,
"acc_stderr": 0.036639749943912434,
"acc_norm": 0.32727272727272727,
"acc_norm_stderr": 0.036639749943912434
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.25252525252525254,
"acc_stderr": 0.030954055470365914,
"acc_norm": 0.25252525252525254,
"acc_norm_stderr": 0.030954055470365914
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.27461139896373055,
"acc_stderr": 0.03221024508041156,
"acc_norm": 0.27461139896373055,
"acc_norm_stderr": 0.03221024508041156
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.31025641025641026,
"acc_stderr": 0.02345467488940429,
"acc_norm": 0.31025641025641026,
"acc_norm_stderr": 0.02345467488940429
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.02684205787383371,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.02684205787383371
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2184873949579832,
"acc_stderr": 0.026841514322958955,
"acc_norm": 0.2184873949579832,
"acc_norm_stderr": 0.026841514322958955
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.24503311258278146,
"acc_stderr": 0.035118075718047245,
"acc_norm": 0.24503311258278146,
"acc_norm_stderr": 0.035118075718047245
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23486238532110093,
"acc_stderr": 0.018175110510343578,
"acc_norm": 0.23486238532110093,
"acc_norm_stderr": 0.018175110510343578
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.28921568627450983,
"acc_stderr": 0.03182231867647553,
"acc_norm": 0.28921568627450983,
"acc_norm_stderr": 0.03182231867647553
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.24050632911392406,
"acc_stderr": 0.027820781981149685,
"acc_norm": 0.24050632911392406,
"acc_norm_stderr": 0.027820781981149685
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3811659192825112,
"acc_stderr": 0.03259625118416828,
"acc_norm": 0.3811659192825112,
"acc_norm_stderr": 0.03259625118416828
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22900763358778625,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.22900763358778625,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.24793388429752067,
"acc_stderr": 0.03941897526516302,
"acc_norm": 0.24793388429752067,
"acc_norm_stderr": 0.03941897526516302
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.27607361963190186,
"acc_stderr": 0.03512385283705051,
"acc_norm": 0.27607361963190186,
"acc_norm_stderr": 0.03512385283705051
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25892857142857145,
"acc_stderr": 0.04157751539865629,
"acc_norm": 0.25892857142857145,
"acc_norm_stderr": 0.04157751539865629
},
"harness|hendrycksTest-management|5": {
"acc": 0.2524271844660194,
"acc_stderr": 0.04301250399690875,
"acc_norm": 0.2524271844660194,
"acc_norm_stderr": 0.04301250399690875
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.26495726495726496,
"acc_stderr": 0.02891120880274948,
"acc_norm": 0.26495726495726496,
"acc_norm_stderr": 0.02891120880274948
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2681992337164751,
"acc_stderr": 0.01584243083526945,
"acc_norm": 0.2681992337164751,
"acc_norm_stderr": 0.01584243083526945
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.23410404624277456,
"acc_stderr": 0.02279711027807114,
"acc_norm": 0.23410404624277456,
"acc_norm_stderr": 0.02279711027807114
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24581005586592178,
"acc_stderr": 0.014400296429225627,
"acc_norm": 0.24581005586592178,
"acc_norm_stderr": 0.014400296429225627
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.20261437908496732,
"acc_stderr": 0.023015446877985672,
"acc_norm": 0.20261437908496732,
"acc_norm_stderr": 0.023015446877985672
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3215434083601286,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.3215434083601286,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2623456790123457,
"acc_stderr": 0.02447722285613511,
"acc_norm": 0.2623456790123457,
"acc_norm_stderr": 0.02447722285613511
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.20567375886524822,
"acc_stderr": 0.02411213895047188,
"acc_norm": 0.20567375886524822,
"acc_norm_stderr": 0.02411213895047188
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2470664928292047,
"acc_stderr": 0.011015752255279336,
"acc_norm": 0.2470664928292047,
"acc_norm_stderr": 0.011015752255279336
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4375,
"acc_stderr": 0.030134614954403924,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.030134614954403924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.016906615927288152,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.016906615927288152
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2818181818181818,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.2818181818181818,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.15918367346938775,
"acc_stderr": 0.023420972069166348,
"acc_norm": 0.15918367346938775,
"acc_norm_stderr": 0.023420972069166348
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.22388059701492538,
"acc_stderr": 0.02947525023601719,
"acc_norm": 0.22388059701492538,
"acc_norm_stderr": 0.02947525023601719
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.2,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.2,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3132530120481928,
"acc_stderr": 0.03610805018031024,
"acc_norm": 0.3132530120481928,
"acc_norm_stderr": 0.03610805018031024
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03218093795602357,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03218093795602357
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22766217870257038,
"mc1_stderr": 0.01467925503211107,
"mc2": 0.38916106981756643,
"mc2_stderr": 0.014160770891106955
},
"harness|winogrande|5": {
"acc": 0.6195737963693765,
"acc_stderr": 0.013644727908656831
},
"harness|gsm8k|5": {
"acc": 0.017437452615617893,
"acc_stderr": 0.0036054868679982663
}
}
```
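The aggregated numbers above can also be read back from the "results" configuration of this dataset; a minimal sketch, using the split names declared in the dataset metadata:

```python
from datasets import load_dataset

# "results" holds the aggregated metrics; the "latest" split always points
# at the most recent evaluation run (here 2024-02-04T23:43:15.860527).
results = load_dataset(
    "open-llm-leaderboard/details_AIGym__TinyLlama-1.1B-2.5T-chat-and-function-calling",
    "results",
    split="latest",
)
print(results[0])
```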
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_AIGym__TinyLlama-1.1B-2.5T-chat-and-function-calling | [
"region:us"
] | 2024-02-04T23:45:04+00:00 | {"pretty_name": "Evaluation run of AIGym/TinyLlama-1.1B-2.5T-chat-and-function-calling", "dataset_summary": "Dataset automatically created during the evaluation run of model [AIGym/TinyLlama-1.1B-2.5T-chat-and-function-calling](https://huggingface.co/AIGym/TinyLlama-1.1B-2.5T-chat-and-function-calling) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AIGym__TinyLlama-1.1B-2.5T-chat-and-function-calling\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-04T23:43:15.860527](https://huggingface.co/datasets/open-llm-leaderboard/details_AIGym__TinyLlama-1.1B-2.5T-chat-and-function-calling/blob/main/results_2024-02-04T23-43-15.860527.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26912346478859156,\n \"acc_stderr\": 0.03120091714101033,\n \"acc_norm\": 0.27025742576226947,\n \"acc_norm_stderr\": 0.03196861086820363,\n \"mc1\": 0.22766217870257038,\n \"mc1_stderr\": 0.01467925503211107,\n \"mc2\": 0.38916106981756643,\n \"mc2_stderr\": 0.014160770891106955\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.32337883959044367,\n \"acc_stderr\": 0.013669421630012136,\n \"acc_norm\": 0.3438566552901024,\n \"acc_norm_stderr\": 0.013880644570156217\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4509061939852619,\n \"acc_stderr\": 0.004965670398127352,\n \"acc_norm\": 0.5960963951404102,\n \"acc_norm_stderr\": 0.004896757857022549\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322716,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322716\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.038201699145179055,\n \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.038201699145179055\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.03317672787533157,\n \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.03317672787533157\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2641509433962264,\n \"acc_stderr\": 0.02713429162874171,\n \"acc_norm\": 0.2641509433962264,\n \"acc_norm_stderr\": 0.02713429162874171\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.03476590104304134,\n 
\"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2023121387283237,\n \"acc_stderr\": 0.03063114553919882,\n \"acc_norm\": 0.2023121387283237,\n \"acc_norm_stderr\": 0.03063114553919882\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171452,\n \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171452\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.31063829787234043,\n \"acc_stderr\": 0.03025123757921317,\n \"acc_norm\": 0.31063829787234043,\n \"acc_norm_stderr\": 0.03025123757921317\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.26455026455026454,\n \"acc_stderr\": 0.022717467897708614,\n \"acc_norm\": 0.26455026455026454,\n \"acc_norm_stderr\": 0.022717467897708614\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23015873015873015,\n \"acc_stderr\": 0.03764950879790606,\n \"acc_norm\": 0.23015873015873015,\n \"acc_norm_stderr\": 0.03764950879790606\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25483870967741934,\n \"acc_stderr\": 0.024790118459332208,\n \"acc_norm\": 0.25483870967741934,\n \"acc_norm_stderr\": 0.024790118459332208\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.24630541871921183,\n \"acc_stderr\": 0.03031509928561773,\n \"acc_norm\": 0.24630541871921183,\n \"acc_norm_stderr\": 0.03031509928561773\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.32727272727272727,\n \"acc_stderr\": 0.036639749943912434,\n \"acc_norm\": 0.32727272727272727,\n \"acc_norm_stderr\": 0.036639749943912434\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.25252525252525254,\n \"acc_stderr\": 0.030954055470365914,\n \"acc_norm\": 0.25252525252525254,\n \"acc_norm_stderr\": 0.030954055470365914\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 
0.27461139896373055,\n \"acc_stderr\": 0.03221024508041156,\n \"acc_norm\": 0.27461139896373055,\n \"acc_norm_stderr\": 0.03221024508041156\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.31025641025641026,\n \"acc_stderr\": 0.02345467488940429,\n \"acc_norm\": 0.31025641025641026,\n \"acc_norm_stderr\": 0.02345467488940429\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26296296296296295,\n \"acc_stderr\": 0.02684205787383371,\n \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.02684205787383371\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.2184873949579832,\n \"acc_stderr\": 0.026841514322958955,\n \"acc_norm\": 0.2184873949579832,\n \"acc_norm_stderr\": 0.026841514322958955\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.24503311258278146,\n \"acc_stderr\": 0.035118075718047245,\n \"acc_norm\": 0.24503311258278146,\n \"acc_norm_stderr\": 0.035118075718047245\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.23486238532110093,\n \"acc_stderr\": 0.018175110510343578,\n \"acc_norm\": 0.23486238532110093,\n \"acc_norm_stderr\": 0.018175110510343578\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.28921568627450983,\n \"acc_stderr\": 0.03182231867647553,\n \"acc_norm\": 0.28921568627450983,\n \"acc_norm_stderr\": 0.03182231867647553\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.24050632911392406,\n \"acc_stderr\": 0.027820781981149685,\n \"acc_norm\": 0.24050632911392406,\n \"acc_norm_stderr\": 0.027820781981149685\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3811659192825112,\n \"acc_stderr\": 0.03259625118416828,\n \"acc_norm\": 0.3811659192825112,\n \"acc_norm_stderr\": 0.03259625118416828\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.22900763358778625,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.22900763358778625,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.24793388429752067,\n \"acc_stderr\": 0.03941897526516302,\n \"acc_norm\": 0.24793388429752067,\n \"acc_norm_stderr\": 0.03941897526516302\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.27607361963190186,\n \"acc_stderr\": 0.03512385283705051,\n \"acc_norm\": 0.27607361963190186,\n \"acc_norm_stderr\": 0.03512385283705051\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25892857142857145,\n \"acc_stderr\": 0.04157751539865629,\n \"acc_norm\": 0.25892857142857145,\n \"acc_norm_stderr\": 0.04157751539865629\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690875,\n \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690875\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.26495726495726496,\n \"acc_stderr\": 0.02891120880274948,\n \"acc_norm\": 0.26495726495726496,\n \"acc_norm_stderr\": 0.02891120880274948\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.26,\n 
\"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2681992337164751,\n \"acc_stderr\": 0.01584243083526945,\n \"acc_norm\": 0.2681992337164751,\n \"acc_norm_stderr\": 0.01584243083526945\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.23410404624277456,\n \"acc_stderr\": 0.02279711027807114,\n \"acc_norm\": 0.23410404624277456,\n \"acc_norm_stderr\": 0.02279711027807114\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24581005586592178,\n \"acc_stderr\": 0.014400296429225627,\n \"acc_norm\": 0.24581005586592178,\n \"acc_norm_stderr\": 0.014400296429225627\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.20261437908496732,\n \"acc_stderr\": 0.023015446877985672,\n \"acc_norm\": 0.20261437908496732,\n \"acc_norm_stderr\": 0.023015446877985672\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3215434083601286,\n \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.3215434083601286,\n \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2623456790123457,\n \"acc_stderr\": 0.02447722285613511,\n \"acc_norm\": 0.2623456790123457,\n \"acc_norm_stderr\": 0.02447722285613511\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.20567375886524822,\n \"acc_stderr\": 0.02411213895047188,\n \"acc_norm\": 0.20567375886524822,\n \"acc_norm_stderr\": 0.02411213895047188\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2470664928292047,\n \"acc_stderr\": 0.011015752255279336,\n \"acc_norm\": 0.2470664928292047,\n \"acc_norm_stderr\": 0.011015752255279336\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.030134614954403924,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.030134614954403924\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.016906615927288152,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.016906615927288152\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2818181818181818,\n \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.2818181818181818,\n \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.15918367346938775,\n \"acc_stderr\": 0.023420972069166348,\n \"acc_norm\": 0.15918367346938775,\n \"acc_norm_stderr\": 0.023420972069166348\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.22388059701492538,\n \"acc_stderr\": 0.02947525023601719,\n \"acc_norm\": 0.22388059701492538,\n \"acc_norm_stderr\": 0.02947525023601719\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.040201512610368445,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.040201512610368445\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3132530120481928,\n \"acc_stderr\": 0.03610805018031024,\n \"acc_norm\": 0.3132530120481928,\n \"acc_norm_stderr\": 0.03610805018031024\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.22807017543859648,\n \"acc_stderr\": 0.03218093795602357,\n \"acc_norm\": 0.22807017543859648,\n \"acc_norm_stderr\": 0.03218093795602357\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22766217870257038,\n \"mc1_stderr\": 0.01467925503211107,\n \"mc2\": 0.38916106981756643,\n \"mc2_stderr\": 0.014160770891106955\n },\n \"harness|winogrande|5\": {\n \"acc\": 
0.6195737963693765,\n \"acc_stderr\": 0.013644727908656831\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.017437452615617893,\n \"acc_stderr\": 0.0036054868679982663\n }\n}\n```", "repo_url": "https://huggingface.co/AIGym/TinyLlama-1.1B-2.5T-chat-and-function-calling", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|arc:challenge|25_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|gsm8k|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hellaswag|10_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T23-43-15.860527.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T23-43-15.860527.parquet", 
"**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T23-43-15.860527.parquet", 
"**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T23-43-15.860527.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T23-43-15.860527.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": 
["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": 
["**/details_harness|truthfulqa:mc|0_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["**/details_harness|winogrande|5_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-04T23-43-15.860527.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_04T23_43_15.860527", "path": ["results_2024-02-04T23-43-15.860527.parquet"]}, {"split": "latest", "path": ["results_2024-02-04T23-43-15.860527.parquet"]}]}]} | 2024-02-04T23:45:27+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of AIGym/TinyLlama-1.1B-2.5T-chat-and-function-calling
Dataset automatically created during the evaluation run of model AIGym/TinyLlama-1.1B-2.5T-chat-and-function-calling on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
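The snippet below is a minimal sketch: it assumes the leaderboard's usual repository naming convention (details_<org>__<model>) and picks the Winogrande configuration listed in this card's metadata as an example.

```python
from datasets import load_dataset

# Load one of the 63 per-task configurations of this evaluation run;
# per the card, the "train" split always points to the latest results.
data = load_dataset(
    "open-llm-leaderboard/details_AIGym__TinyLlama-1.1B-2.5T-chat-and-function-calling",
    "harness_winogrande_5",
    split="train",
)
```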
## Latest results
These are the latest results from run 2024-02-04T23:43:15.860527 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of AIGym/TinyLlama-1.1B-2.5T-chat-and-function-calling\n\n\n\nDataset automatically created during the evaluation run of model AIGym/TinyLlama-1.1B-2.5T-chat-and-function-calling on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T23:43:15.860527(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of AIGym/TinyLlama-1.1B-2.5T-chat-and-function-calling\n\n\n\nDataset automatically created during the evaluation run of model AIGym/TinyLlama-1.1B-2.5T-chat-and-function-calling on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T23:43:15.860527(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
e06b359a30900c35324f9c6564cbe0f06d666f4c |
# Dataset Card for Evaluation run of Josephgflowers/distillgpt2Cinder
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Josephgflowers/distillgpt2Cinder](https://huggingface.co/Josephgflowers/distillgpt2Cinder) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Josephgflowers__distillgpt2Cinder",
"harness_winogrande_5",
split="train")
```
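In the same way, the aggregated metrics can be pulled from the dedicated "results" configuration described above (a minimal sketch; the "latest" split defined in this card's metadata always points at the most recent run):

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of the run,
# i.e. the numbers displayed on the Open LLM Leaderboard.
results = load_dataset("open-llm-leaderboard/details_Josephgflowers__distillgpt2Cinder",
	"results",
	split="latest")
```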
## Latest results
These are the [latest results from run 2024-02-04T23:46:27.372796](https://huggingface.co/datasets/open-llm-leaderboard/details_Josephgflowers__distillgpt2Cinder/blob/main/results_2024-02-04T23-46-27.372796.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.24948426094911816,
"acc_stderr": 0.03052443478922094,
"acc_norm": 0.2500035098132215,
"acc_norm_stderr": 0.031309813832633156,
"mc1": 0.24357405140758873,
"mc1_stderr": 0.015026354824910782,
"mc2": 0.43961123482428127,
"mc2_stderr": 0.01535210261478119
},
"harness|arc:challenge|25": {
"acc": 0.21075085324232082,
"acc_stderr": 0.01191827175485218,
"acc_norm": 0.24488054607508533,
"acc_norm_stderr": 0.012566273985131356
},
"harness|hellaswag|10": {
"acc": 0.2713602867954591,
"acc_stderr": 0.004437527732850682,
"acc_norm": 0.27235610436168095,
"acc_norm_stderr": 0.004442623590846321
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.21710526315789475,
"acc_stderr": 0.033550453048829226,
"acc_norm": 0.21710526315789475,
"acc_norm_stderr": 0.033550453048829226
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.23018867924528302,
"acc_stderr": 0.025907897122408173,
"acc_norm": 0.23018867924528302,
"acc_norm_stderr": 0.025907897122408173
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.16,
"acc_stderr": 0.036845294917747094,
"acc_norm": 0.16,
"acc_norm_stderr": 0.036845294917747094
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2658959537572254,
"acc_stderr": 0.03368762932259431,
"acc_norm": 0.2658959537572254,
"acc_norm_stderr": 0.03368762932259431
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617746,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617746
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.20425531914893616,
"acc_stderr": 0.026355158413349424,
"acc_norm": 0.20425531914893616,
"acc_norm_stderr": 0.026355158413349424
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748141,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748141
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2620689655172414,
"acc_stderr": 0.036646663372252565,
"acc_norm": 0.2620689655172414,
"acc_norm_stderr": 0.036646663372252565
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2804232804232804,
"acc_stderr": 0.023135287974325635,
"acc_norm": 0.2804232804232804,
"acc_norm_stderr": 0.023135287974325635
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.14285714285714285,
"acc_stderr": 0.03129843185743808,
"acc_norm": 0.14285714285714285,
"acc_norm_stderr": 0.03129843185743808
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2709677419354839,
"acc_stderr": 0.02528441611490016,
"acc_norm": 0.2709677419354839,
"acc_norm_stderr": 0.02528441611490016
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.032104944337514575,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.032104944337514575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.22424242424242424,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.22424242424242424,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3282828282828283,
"acc_stderr": 0.03345678422756776,
"acc_norm": 0.3282828282828283,
"acc_norm_stderr": 0.03345678422756776
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22279792746113988,
"acc_stderr": 0.03003114797764154,
"acc_norm": 0.22279792746113988,
"acc_norm_stderr": 0.03003114797764154
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2153846153846154,
"acc_stderr": 0.020843034557462878,
"acc_norm": 0.2153846153846154,
"acc_norm_stderr": 0.020843034557462878
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.026962424325073838,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.026962424325073838
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.22268907563025211,
"acc_stderr": 0.02702543349888237,
"acc_norm": 0.22268907563025211,
"acc_norm_stderr": 0.02702543349888237
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389023,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389023
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3284403669724771,
"acc_stderr": 0.020135902797298395,
"acc_norm": 0.3284403669724771,
"acc_norm_stderr": 0.020135902797298395
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.028353212866863445,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.028353212866863445
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2489451476793249,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.2489451476793249,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.14349775784753363,
"acc_stderr": 0.023529371269618193,
"acc_norm": 0.14349775784753363,
"acc_norm_stderr": 0.023529371269618193
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.1984732824427481,
"acc_stderr": 0.03498149385462471,
"acc_norm": 0.1984732824427481,
"acc_norm_stderr": 0.03498149385462471
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.371900826446281,
"acc_stderr": 0.044120158066245044,
"acc_norm": 0.371900826446281,
"acc_norm_stderr": 0.044120158066245044
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.04284467968052191,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.04284467968052191
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2883435582822086,
"acc_stderr": 0.035590395316173425,
"acc_norm": 0.2883435582822086,
"acc_norm_stderr": 0.035590395316173425
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.24107142857142858,
"acc_stderr": 0.04059867246952687,
"acc_norm": 0.24107142857142858,
"acc_norm_stderr": 0.04059867246952687
},
"harness|hendrycksTest-management|5": {
"acc": 0.1553398058252427,
"acc_stderr": 0.03586594738573973,
"acc_norm": 0.1553398058252427,
"acc_norm_stderr": 0.03586594738573973
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2564102564102564,
"acc_stderr": 0.02860595370200424,
"acc_norm": 0.2564102564102564,
"acc_norm_stderr": 0.02860595370200424
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.21328224776500637,
"acc_stderr": 0.014648172749593522,
"acc_norm": 0.21328224776500637,
"acc_norm_stderr": 0.014648172749593522
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24277456647398843,
"acc_stderr": 0.023083658586984204,
"acc_norm": 0.24277456647398843,
"acc_norm_stderr": 0.023083658586984204
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.27124183006535946,
"acc_stderr": 0.025457756696667878,
"acc_norm": 0.27124183006535946,
"acc_norm_stderr": 0.025457756696667878
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2990353697749196,
"acc_stderr": 0.026003301117885135,
"acc_norm": 0.2990353697749196,
"acc_norm_stderr": 0.026003301117885135
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.02492200116888633,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.02492200116888633
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24113475177304963,
"acc_stderr": 0.025518731049537776,
"acc_norm": 0.24113475177304963,
"acc_norm_stderr": 0.025518731049537776
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.2977941176470588,
"acc_stderr": 0.02777829870154544,
"acc_norm": 0.2977941176470588,
"acc_norm_stderr": 0.02777829870154544
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.272875816993464,
"acc_stderr": 0.018020474148393577,
"acc_norm": 0.272875816993464,
"acc_norm_stderr": 0.018020474148393577
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.20909090909090908,
"acc_stderr": 0.038950910157241364,
"acc_norm": 0.20909090909090908,
"acc_norm_stderr": 0.038950910157241364
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24897959183673468,
"acc_stderr": 0.027682979522960234,
"acc_norm": 0.24897959183673468,
"acc_norm_stderr": 0.027682979522960234
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24875621890547264,
"acc_stderr": 0.030567675938916707,
"acc_norm": 0.24875621890547264,
"acc_norm_stderr": 0.030567675938916707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.20481927710843373,
"acc_stderr": 0.03141784291663926,
"acc_norm": 0.20481927710843373,
"acc_norm_stderr": 0.03141784291663926
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.033014059469872514,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.033014059469872514
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24357405140758873,
"mc1_stderr": 0.015026354824910782,
"mc2": 0.43961123482428127,
"mc2_stderr": 0.01535210261478119
},
"harness|winogrande|5": {
"acc": 0.5011838989739542,
"acc_stderr": 0.014052446290529019
},
"harness|gsm8k|5": {
"acc": 0.002274450341167551,
"acc_stderr": 0.0013121578148674162
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Josephgflowers__distillgpt2Cinder | [
"region:us"
] | 2024-02-04T23:47:54+00:00 | {"pretty_name": "Evaluation run of Josephgflowers/distillgpt2Cinder", "dataset_summary": "Dataset automatically created during the evaluation run of model [Josephgflowers/distillgpt2Cinder](https://huggingface.co/Josephgflowers/distillgpt2Cinder) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Josephgflowers__distillgpt2Cinder\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-04T23:46:27.372796](https://huggingface.co/datasets/open-llm-leaderboard/details_Josephgflowers__distillgpt2Cinder/blob/main/results_2024-02-04T23-46-27.372796.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24948426094911816,\n \"acc_stderr\": 0.03052443478922094,\n \"acc_norm\": 0.2500035098132215,\n \"acc_norm_stderr\": 0.031309813832633156,\n \"mc1\": 0.24357405140758873,\n \"mc1_stderr\": 0.015026354824910782,\n \"mc2\": 0.43961123482428127,\n \"mc2_stderr\": 0.01535210261478119\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.21075085324232082,\n \"acc_stderr\": 0.01191827175485218,\n \"acc_norm\": 0.24488054607508533,\n \"acc_norm_stderr\": 0.012566273985131356\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2713602867954591,\n \"acc_stderr\": 0.004437527732850682,\n \"acc_norm\": 0.27235610436168095,\n \"acc_norm_stderr\": 0.004442623590846321\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.21710526315789475,\n \"acc_stderr\": 0.033550453048829226,\n \"acc_norm\": 0.21710526315789475,\n \"acc_norm_stderr\": 0.033550453048829226\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.23018867924528302,\n \"acc_stderr\": 0.025907897122408173,\n \"acc_norm\": 0.23018867924528302,\n \"acc_norm_stderr\": 0.025907897122408173\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2638888888888889,\n \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.2638888888888889,\n \"acc_norm_stderr\": 0.03685651095897532\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.16,\n \"acc_stderr\": 0.036845294917747094,\n \"acc_norm\": 0.16,\n \"acc_norm_stderr\": 0.036845294917747094\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2658959537572254,\n \"acc_stderr\": 0.03368762932259431,\n \"acc_norm\": 0.2658959537572254,\n \"acc_norm_stderr\": 0.03368762932259431\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617746,\n \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617746\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.20425531914893616,\n \"acc_stderr\": 0.026355158413349424,\n \"acc_norm\": 0.20425531914893616,\n \"acc_norm_stderr\": 0.026355158413349424\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n \"acc_stderr\": 0.04049339297748141,\n \"acc_norm\": 0.24561403508771928,\n \"acc_norm_stderr\": 0.04049339297748141\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2620689655172414,\n \"acc_stderr\": 0.036646663372252565,\n \"acc_norm\": 0.2620689655172414,\n \"acc_norm_stderr\": 0.036646663372252565\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2804232804232804,\n \"acc_stderr\": 0.023135287974325635,\n \"acc_norm\": 0.2804232804232804,\n \"acc_norm_stderr\": 0.023135287974325635\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.14285714285714285,\n \"acc_stderr\": 0.03129843185743808,\n \"acc_norm\": 0.14285714285714285,\n \"acc_norm_stderr\": 0.03129843185743808\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.17,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2709677419354839,\n \"acc_stderr\": 0.02528441611490016,\n \"acc_norm\": 0.2709677419354839,\n \"acc_norm_stderr\": 0.02528441611490016\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n \"acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.22424242424242424,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.22424242424242424,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.3282828282828283,\n \"acc_stderr\": 0.03345678422756776,\n \"acc_norm\": 0.3282828282828283,\n \"acc_norm_stderr\": 0.03345678422756776\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.22279792746113988,\n \"acc_stderr\": 0.03003114797764154,\n \"acc_norm\": 0.22279792746113988,\n 
\"acc_norm_stderr\": 0.03003114797764154\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.2153846153846154,\n \"acc_stderr\": 0.020843034557462878,\n \"acc_norm\": 0.2153846153846154,\n \"acc_norm_stderr\": 0.020843034557462878\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073838,\n \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073838\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.22268907563025211,\n \"acc_stderr\": 0.02702543349888237,\n \"acc_norm\": 0.22268907563025211,\n \"acc_norm_stderr\": 0.02702543349888237\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3284403669724771,\n \"acc_stderr\": 0.020135902797298395,\n \"acc_norm\": 0.3284403669724771,\n \"acc_norm_stderr\": 0.020135902797298395\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.028353212866863445,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.028353212866863445\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2489451476793249,\n \"acc_stderr\": 0.028146970599422644,\n \"acc_norm\": 0.2489451476793249,\n \"acc_norm_stderr\": 0.028146970599422644\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.14349775784753363,\n \"acc_stderr\": 0.023529371269618193,\n \"acc_norm\": 0.14349775784753363,\n \"acc_norm_stderr\": 0.023529371269618193\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.1984732824427481,\n \"acc_stderr\": 0.03498149385462471,\n \"acc_norm\": 0.1984732824427481,\n \"acc_norm_stderr\": 0.03498149385462471\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.371900826446281,\n \"acc_stderr\": 0.044120158066245044,\n \"acc_norm\": 0.371900826446281,\n \"acc_norm_stderr\": 0.044120158066245044\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.04284467968052191,\n \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.04284467968052191\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2883435582822086,\n \"acc_stderr\": 0.035590395316173425,\n \"acc_norm\": 0.2883435582822086,\n \"acc_norm_stderr\": 0.035590395316173425\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.24107142857142858,\n \"acc_stderr\": 0.04059867246952687,\n \"acc_norm\": 0.24107142857142858,\n \"acc_norm_stderr\": 0.04059867246952687\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.1553398058252427,\n \"acc_stderr\": 0.03586594738573973,\n \"acc_norm\": 0.1553398058252427,\n \"acc_norm_stderr\": 0.03586594738573973\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2564102564102564,\n \"acc_stderr\": 0.02860595370200424,\n \"acc_norm\": 0.2564102564102564,\n \"acc_norm_stderr\": 0.02860595370200424\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.21328224776500637,\n \"acc_stderr\": 0.014648172749593522,\n \"acc_norm\": 0.21328224776500637,\n \"acc_norm_stderr\": 0.014648172749593522\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24277456647398843,\n \"acc_stderr\": 0.023083658586984204,\n \"acc_norm\": 0.24277456647398843,\n \"acc_norm_stderr\": 0.023083658586984204\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.27124183006535946,\n \"acc_stderr\": 0.025457756696667878,\n \"acc_norm\": 0.27124183006535946,\n \"acc_norm_stderr\": 0.025457756696667878\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2990353697749196,\n \"acc_stderr\": 0.026003301117885135,\n \"acc_norm\": 0.2990353697749196,\n \"acc_norm_stderr\": 0.026003301117885135\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.02492200116888633,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.02492200116888633\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.24113475177304963,\n \"acc_stderr\": 0.025518731049537776,\n \"acc_norm\": 0.24113475177304963,\n \"acc_norm_stderr\": 0.025518731049537776\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.2977941176470588,\n \"acc_stderr\": 0.02777829870154544,\n \"acc_norm\": 0.2977941176470588,\n \"acc_norm_stderr\": 0.02777829870154544\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.272875816993464,\n \"acc_stderr\": 0.018020474148393577,\n \"acc_norm\": 0.272875816993464,\n \"acc_norm_stderr\": 0.018020474148393577\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.20909090909090908,\n \"acc_stderr\": 0.038950910157241364,\n \"acc_norm\": 0.20909090909090908,\n \"acc_norm_stderr\": 0.038950910157241364\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.24897959183673468,\n \"acc_stderr\": 0.027682979522960234,\n \"acc_norm\": 0.24897959183673468,\n \"acc_norm_stderr\": 0.027682979522960234\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24875621890547264,\n \"acc_stderr\": 0.030567675938916707,\n \"acc_norm\": 0.24875621890547264,\n \"acc_norm_stderr\": 0.030567675938916707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.20481927710843373,\n \"acc_stderr\": 0.03141784291663926,\n \"acc_norm\": 0.20481927710843373,\n \"acc_norm_stderr\": 0.03141784291663926\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.24561403508771928,\n \"acc_stderr\": 0.033014059469872514,\n \"acc_norm\": 0.24561403508771928,\n \"acc_norm_stderr\": 0.033014059469872514\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24357405140758873,\n \"mc1_stderr\": 0.015026354824910782,\n \"mc2\": 0.43961123482428127,\n \"mc2_stderr\": 0.01535210261478119\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5011838989739542,\n \"acc_stderr\": 0.014052446290529019\n },\n 
\"harness|gsm8k|5\": {\n \"acc\": 0.002274450341167551,\n \"acc_stderr\": 0.0013121578148674162\n }\n}\n```", "repo_url": "https://huggingface.co/Josephgflowers/distillgpt2Cinder", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|arc:challenge|25_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|gsm8k|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hellaswag|10_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T23-46-27.372796.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T23-46-27.372796.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T23-46-27.372796.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T23-46-27.372796.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T23-46-27.372796.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["**/details_harness|winogrande|5_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-02-04T23-46-27.372796.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_04T23_46_27.372796", "path": ["results_2024-02-04T23-46-27.372796.parquet"]}, {"split": "latest", "path": ["results_2024-02-04T23-46-27.372796.parquet"]}]}]} | 2024-02-04T23:48:23+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Josephgflowers/distillgpt2Cinder
Dataset automatically created during the evaluation run of model Josephgflowers/distillgpt2Cinder on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
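A minimal sketch with the `datasets` library is shown below (the repository name is an assumption following the leaderboard's usual `details_<org>__<model>` naming scheme; the configuration and split names are taken from the configs listed in this repository):

```python
from datasets import load_dataset

# Assumed repository name, following the leaderboard's usual
# "details_<org>__<model>" naming scheme; adjust if it differs.
data = load_dataset(
    "open-llm-leaderboard/details_Josephgflowers__distillgpt2Cinder",
    "harness_winogrande_5",  # any of the 63 task configurations listed below
    split="latest",          # or a timestamped split such as "2024_02_04T23_46_27.372796"
)
print(data[0])
```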
## Latest results
These are the latest results from run 2024-02-04T23:46:27.372796 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Josephgflowers/distillgpt2Cinder\n\n\n\nDataset automatically created during the evaluation run of model Josephgflowers/distillgpt2Cinder on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T23:46:27.372796(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Josephgflowers/distillgpt2Cinder\n\n\n\nDataset automatically created during the evaluation run of model Josephgflowers/distillgpt2Cinder on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T23:46:27.372796(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
be5b02011e6d8752797e75452732c36a3475b657 |
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
NorQuAD is the first Norwegian question answering dataset for machine reading comprehension, created from scratch in Norwegian. The dataset consists of 4,752 manually created question-answer pairs.
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
The dataset provides Norwegian question-answer pairs taken from two data sources: Wikipedia and news.
- **Curated by:** Human annotators.
- **Funded by:** The UiO Teksthub initiative
- **Shared by:** The [Language Technology Group](https://www.mn.uio.no/ifi/english/research/groups/ltg/), University of Oslo
- **Language(s) (NLP):** Norwegian Bokmål
- **License:** CC0-1.0
### Dataset Sources
<!-- Provide the basic links for the dataset. -->
- **Repository:** [https://github.com/ltgoslo/NorQuAD](https://github.com/ltgoslo/NorQuAD)
- **Paper:** [Ivanova et. al., 2023](https://aclanthology.org/2023.nodalida-1.17.pdf)
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
The dataset is intended to be used for NLP model development and benchmarking.
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
**Data Instances**
```
{
"id": "1",
"context": "This is a test context",
"question": "This is a question",
"answers": {
"answer_start": [1],
"text": ["This is an answer"]
},
}
```
**Data Fields**
```
id: a string feature.
context: a string feature.
question: a string feature.
answers: a dictionary feature containing:
text: a string feature.
answer_start: a int32 feature.
```
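The answers field follows the SQuAD convention: `answer_start` is a character offset into `context`, so the annotated span can be recovered directly. A minimal sketch (the field names are those listed above; the consistency check assumes offsets index the raw context string, as in SQuAD):

```python
def extract_answer(example):
    """Recover the first annotated answer span from the context via its character offset."""
    start = example["answers"]["answer_start"][0]
    text = example["answers"]["text"][0]
    # Sanity check: the gold answer should be the substring of the context
    # starting at `answer_start` (SQuAD-style character offsets).
    assert example["context"][start:start + len(text)] == text
    return text
```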
**Dataset Splits**
NorQuAD consists of training (3808 examples), validation (472), and public test (472) sets.
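The splits can be loaded with the `datasets` library, for example (a minimal sketch using the repository identifier this card is published under):

```python
from datasets import load_dataset

# Load all three NorQuAD splits from the Hugging Face Hub.
norquad = load_dataset("ltg/norquad")
print(norquad)                          # DatasetDict with train/validation/test
print(norquad["train"][0]["question"])  # first training question
```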
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
Machine reading comprehension is one of the key problems in natural language understanding. The question answering (QA) task requires a machine to read and comprehend a given text passage, and then answer questions about the passage. While there is steady progress in reading comprehension and question answering for English and a few other languages, Norwegian has lacked annotated data for this task. This project aims to compile human-created training, validation, and test sets for question answering in Norwegian.
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
**Wikipedia**: 872 articles were sampled from Norwegian Bokmål Wikipedia.
**News**: For the news category, articles were sampled from Norsk Aviskorpus, an openly available dataset of Norwegian news.
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
**Wikipedia**: In order to include high-quality articles, 130 articles from the 'Recommended' section and 139 from the 'Featured' section were sampled. The remaining 603 articles were randomly sampled from the rest of the Wikipedia corpus. From the sampled articles, only the "Introduction" sections were selected as passages for annotation.
**News**: 1000 articles were sampled from the Norsk Aviskorpus (NAK)—a collection of Norwegian news texts
for the year 2019. As was the case with Wikipedia articles, we chose
only news articles which consisted of at least 300
words.
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
The data is sourced from Norwegian Wikipedia dumps as well as the openly available [Norwegian News Corpus](https://www.nb.no/sprakbanken/ressurskatalog/oai-nb-no-sbr-4/), available from the Språkbanken repository.
### Annotations
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
In total, the annotators processed 353 passages from Wikipedia and 403 passages from news, creating a
total of 4,752 question-answer pairs.
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
The dataset was created in three stages: (i) selecting text passages, (ii) collecting question-answer
pairs for those passages, and (iii) human validation of (a subset of) created question-answer pairs.
#### Text selection
Data was selected from openly available sources from Wikipedia and News data, as described above.
#### Question-Answer Pairs
The annotators were provided with a set of initial instructions, largely based on those for similar datasets, in particular, the English SQuAD
dataset (Rajpurkar et al., 2016) and the GermanQuAD data (Möller et al., 2021). These instructions were subsequently refined following regular
meetings with the annotation team.
The annotation guidelines provided to the annotators are available [here](https://github.com/ltgoslo/NorQuAD/blob/main/guidelines.md).
For annotation, we used the Haystack annotation tool, which was designed for QA collection.
#### Human validation
In a separate stage, the annotators validated a subset of the NorQuAD dataset. In this phase, each
annotator replied to the questions created by the
other annotator. We chose the question-answer
pairs for validation at random. In total, 1378 questions from the set of question-answer pairs were
answered by validators.
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
Two students of the Master’s program in Natural Language Processing at the University of Oslo,
both native Norwegian speakers, created question-answer pairs from the collected passages. Each
student received a separate set of passages for annotation. The students received financial remuneration for their efforts and are co-authors of the
paper describing the resource.
## Citation
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
```
@inproceedings{
ivanova2023norquad,
title={NorQu{AD}: Norwegian Question Answering Dataset},
author={Sardana Ivanova and Fredrik Aas Andreassen and Matias Jentoft and Sondre Wold and Lilja {\O}vrelid},
booktitle={The 24th Nordic Conference on Computational Linguistics},
year={2023},
url={https://aclanthology.org/2023.nodalida-1.17.pdf}
}
```
**APA:**
[More Information Needed]
## Dataset Card Authors
Vladislav Mikhailov and Lilja Øvrelid
## Dataset Card Contact
[email protected] and [email protected] | ltg/norquad | [
"task_categories:question-answering",
"size_categories:1K<n<10K",
"language:nb",
"license:cc0-1.0",
"region:us"
] | 2024-02-04T23:55:50+00:00 | {"language": ["nb"], "license": "cc0-1.0", "size_categories": ["1K<n<10K"], "task_categories": ["question-answering"], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "text", "dtype": "string"}, {"name": "answer_start", "dtype": "int32"}]}], "splits": [{"name": "train", "num_bytes": 8739891, "num_examples": 3808}, {"name": "validation", "num_bytes": 1081237, "num_examples": 472}, {"name": "test", "num_bytes": 1096650, "num_examples": 472}], "download_size": 4188322, "dataset_size": 10917778}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-02-09T09:07:57+00:00 | [] | [
"nb"
] | TAGS
#task_categories-question-answering #size_categories-1K<n<10K #language-Norwegian Bokmål #license-cc0-1.0 #region-us
|
# Dataset Card for Dataset Name
NorQuAD is the first Norwegian question answering dataset for machine reading comprehension, created from scratch in Norwegian. The dataset consists of 4,752 manually created question-answer pairs.
## Dataset Details
### Dataset Description
The dataset provides Norwegian question-answer pairs taken from two data sources: Wikipedia and news.
- Curated by: Human annotators.
- Funded by: The UiO Teksthub initiative
- Shared by: The Language Technology Group, University of Oslo
- Language(s) (NLP): Norwegian Bokmål
- License: CC0-1.0
### Dataset Sources
- Repository: URL
- Paper: Ivanova et. al., 2023
## Uses
The dataset is intended to be used for NLP model development and benchmarking.
## Dataset Structure
Data Instances
Data Fields
Dataset Splits
NorQuAD consists of training (3808 examples), validation (472), and public test (472) sets.
## Dataset Creation
### Curation Rationale
Machine reading comprehension is one of the key problems in natural language understanding. The question answering (QA) task requires a machine to read and comprehend a given text passage, and then answer questions about the passage. There is progress in reading comprehension and question answering for English and a few other languages. We would like to fill in the lack of annotated data for question answering for Norwegian. This project aims at compiling human-created training, validation, and test sets for the task for Norwegian.
### Source Data
Wikipedia: 872 articles were sampled from Norwegian Bokmal Wikipedia.
News: For the news category, articles were sampled from Norsk Aviskorpus, an openly available dataset of Norwegian news.
#### Data Collection and Processing
Wikipedia:In order to include high-quality articles, 130 articles from the
‘Recommended‘ section and 139 from the ‘Featured‘ section were sampled. The remaining 603 articles were randomly sampled from the remaining Wikipedia
corpus. From the sampled articles, we chose only the “Introduction“ sections to be selected as passages for annotation.
News: 1000 articles were sampled from the Norsk Aviskorpus (NAK)—a collection of Norwegian news texts
for the year 2019. As was the case with Wikipedia articles, we chose
only news articles which consisted of at least 300
words.
#### Who are the source data producers?
The data is sourced from Norwegian Wikipedia dumps as well as the openly available Norwegian News Corpus, available from the Språkbanken repository.
### Annotations
In total, the annotators processed 353 passages from Wikipedia and 403 passages from news, creating a
total of 4,752 question-answer pairs.
#### Annotation process
The dataset was created in three stages: (i) selecting text passages, (ii) collecting question-answer
pairs for those passages, and (iii) human validation of (a subset of) created question-answer pairs.
#### Text selection
Data was selected from openly available sources from Wikipedia and News data, as described above.
#### Question-Answer Pairs
The annotators were provided with a set of initial instructions, largely based on those for similar datasets, in particular, the English SQuAD
dataset (Rajpurkar et al., 2016) and the GermanQuAD data (Moller et al., 2021). These instructions were subsequently refined following regular
meetings with the annotation team.
The annotation guidelines provided to the annotators are available (here)[URL
For annotation, we used the Haystack annotation tool, which was designed for QA collection.
#### Human validation
In a separate stage, the annotators validated a subset of the NorQuAD dataset. In this phase, each
annotator replied to the questions created by the
other annotator. We chose the question-answer
pairs for validation at random. In total, 1378 questions from the set of question-answer pairs were
answered by validators.
#### Who are the annotators?
Two students of the Master’s program in Natural Language Processing at the University of Oslo,
both native Norwegian speakers, created question-answer pairs from the collected passages. Each
student received a separate set of passages for annotation. The students received financial remuneration for their efforts and are co-authors of the
paper describing the resource.
BibTeX:
APA:
## Dataset Card Authors
Vladislav Mikhailov and Lilja Øvrelid
## Dataset Card Contact
vladism@URL and liljao@URL | [
"# Dataset Card for Dataset Name\n\n\n\nNorQuAD is the first Norwegian question answering dataset for machine reading comprehension, created from scratch in Norwegian. The dataset consists of 4,752 manually created question-answer pairs.",
"## Dataset Details",
"### Dataset Description\n\n\nThe dataset provides Norwegian question-answer pairs taken from two data sources: Wikipedia and news.\n\n\n- Curated by: Human annotators.\n- Funded by: The UiO Teksthub initiative\n- Shared by: The Language Technology Group, University of Oslo\n- Language(s) (NLP): Norwegian Bokmål\n- License: CC0-1.0",
"### Dataset Sources\n\n\n\n- Repository: URL\n- Paper: Ivanova et. al., 2023",
"## Uses\n\n\n\nThe dataset is intended to be used for NLP model development and benchmarking.",
"## Dataset Structure\n\n\n\nData Instances\n\n\n\nData Fields\n\n\n\nDataset Splits\n\nNorQuAD consists of training (3808 examples), validation (472), and public test (472) sets.",
"## Dataset Creation",
"### Curation Rationale\n\n\nMachine reading comprehension is one of the key problems in natural language understanding. The question answering (QA) task requires a machine to read and comprehend a given text passage, and then answer questions about the passage. There is progress in reading comprehension and question answering for English and a few other languages. We would like to fill in the lack of annotated data for question answering for Norwegian. This project aims at compiling human-created training, validation, and test sets for the task for Norwegian.",
"### Source Data\n\n\n\nWikipedia: 872 articles were sampled from Norwegian Bokmal Wikipedia. \n\nNews: For the news category, articles were sampled from Norsk Aviskorpus, an openly available dataset of Norwegian news.",
"#### Data Collection and Processing\n\n\n\nWikipedia:In order to include high-quality articles, 130 articles from the\n‘Recommended‘ section and 139 from the ‘Featured‘ section were sampled. The remaining 603 articles were randomly sampled from the remaining Wikipedia\ncorpus. From the sampled articles, we chose only the “Introduction“ sections to be selected as passages for annotation.\n\nNews: 1000 articles were sampled from the Norsk Aviskorpus (NAK)—a collection of Norwegian news texts\nfor the year 2019. As was the case with Wikipedia articles, we chose\nonly news articles which consisted of at least 300\nwords.",
"#### Who are the source data producers?\n\n\n\nThe data is sourced from Norwegian Wikipedia dumps as well as the openly available Norwegian News Corpus, available from the Språkbanken repository.",
"### Annotations\n\n\nIn total, the annotators processed 353 passages from Wikipedia and 403 passages from news, creating a\ntotal of 4,752 question-answer pairs.",
"#### Annotation process\n\n\n\nThe dataset was created in three stages: (i) selecting text passages, (ii) collecting question-answer\npairs for those passages, and (iii) human validation of (a subset of) created question-answer pairs.",
"#### Text selection\n\nData was selected from openly available sources from Wikipedia and News data, as described above.",
"#### Question-Answer Pairs\n\nThe annotators were provided with a set of initial instructions, largely based on those for similar datasets, in particular, the English SQuAD\ndataset (Rajpurkar et al., 2016) and the GermanQuAD data (Moller et al., 2021). These instructions were subsequently refined following regular\nmeetings with the annotation team.\nThe annotation guidelines provided to the annotators are available (here)[URL\nFor annotation, we used the Haystack annotation tool, which was designed for QA collection.",
"#### Human validation\n\nIn a separate stage, the annotators validated a subset of the NorQuAD dataset. In this phase, each\nannotator replied to the questions created by the\nother annotator. We chose the question-answer\npairs for validation at random. In total, 1378 questions from the set of question-answer pairs were\nanswered by validators.",
"#### Who are the annotators?\n\n\n\nTwo students of the Master’s program in Natural Language Processing at the University of Oslo,\nboth native Norwegian speakers, created question-answer pairs from the collected passages. Each\nstudent received a separate set of passages for annotation. The students received financial remuneration for their efforts and are co-authors of the\npaper describing the resource. \n\n\n\nBibTeX:\n\nAPA:",
"## Dataset Card Authors\n\nVladislav Mikhailov and Lilja Øvrelid",
"## Dataset Card Contact\n\nvladism@URL and liljao@URL"
] | [
"TAGS\n#task_categories-question-answering #size_categories-1K<n<10K #language-Norwegian Bokmål #license-cc0-1.0 #region-us \n",
"# Dataset Card for Dataset Name\n\n\n\nNorQuAD is the first Norwegian question answering dataset for machine reading comprehension, created from scratch in Norwegian. The dataset consists of 4,752 manually created question-answer pairs.",
"## Dataset Details",
"### Dataset Description\n\n\nThe dataset provides Norwegian question-answer pairs taken from two data sources: Wikipedia and news.\n\n\n- Curated by: Human annotators.\n- Funded by: The UiO Teksthub initiative\n- Shared by: The Language Technology Group, University of Oslo\n- Language(s) (NLP): Norwegian Bokmål\n- License: CC0-1.0",
"### Dataset Sources\n\n\n\n- Repository: URL\n- Paper: Ivanova et. al., 2023",
"## Uses\n\n\n\nThe dataset is intended to be used for NLP model development and benchmarking.",
"## Dataset Structure\n\n\n\nData Instances\n\n\n\nData Fields\n\n\n\nDataset Splits\n\nNorQuAD consists of training (3808 examples), validation (472), and public test (472) sets.",
"## Dataset Creation",
"### Curation Rationale\n\n\nMachine reading comprehension is one of the key problems in natural language understanding. The question answering (QA) task requires a machine to read and comprehend a given text passage, and then answer questions about the passage. There is progress in reading comprehension and question answering for English and a few other languages. We would like to fill in the lack of annotated data for question answering for Norwegian. This project aims at compiling human-created training, validation, and test sets for the task for Norwegian.",
"### Source Data\n\n\n\nWikipedia: 872 articles were sampled from Norwegian Bokmal Wikipedia. \n\nNews: For the news category, articles were sampled from Norsk Aviskorpus, an openly available dataset of Norwegian news.",
"#### Data Collection and Processing\n\n\n\nWikipedia:In order to include high-quality articles, 130 articles from the\n‘Recommended‘ section and 139 from the ‘Featured‘ section were sampled. The remaining 603 articles were randomly sampled from the remaining Wikipedia\ncorpus. From the sampled articles, we chose only the “Introduction“ sections to be selected as passages for annotation.\n\nNews: 1000 articles were sampled from the Norsk Aviskorpus (NAK)—a collection of Norwegian news texts\nfor the year 2019. As was the case with Wikipedia articles, we chose\nonly news articles which consisted of at least 300\nwords.",
"#### Who are the source data producers?\n\n\n\nThe data is sourced from Norwegian Wikipedia dumps as well as the openly available Norwegian News Corpus, available from the Språkbanken repository.",
"### Annotations\n\n\nIn total, the annotators processed 353 passages from Wikipedia and 403 passages from news, creating a\ntotal of 4,752 question-answer pairs.",
"#### Annotation process\n\n\n\nThe dataset was created in three stages: (i) selecting text passages, (ii) collecting question-answer\npairs for those passages, and (iii) human validation of (a subset of) created question-answer pairs.",
"#### Text selection\n\nData was selected from openly available sources from Wikipedia and News data, as described above.",
"#### Question-Answer Pairs\n\nThe annotators were provided with a set of initial instructions, largely based on those for similar datasets, in particular, the English SQuAD\ndataset (Rajpurkar et al., 2016) and the GermanQuAD data (Moller et al., 2021). These instructions were subsequently refined following regular\nmeetings with the annotation team.\nThe annotation guidelines provided to the annotators are available (here)[URL\nFor annotation, we used the Haystack annotation tool, which was designed for QA collection.",
"#### Human validation\n\nIn a separate stage, the annotators validated a subset of the NorQuAD dataset. In this phase, each\nannotator replied to the questions created by the\nother annotator. We chose the question-answer\npairs for validation at random. In total, 1378 questions from the set of question-answer pairs were\nanswered by validators.",
"#### Who are the annotators?\n\n\n\nTwo students of the Master’s program in Natural Language Processing at the University of Oslo,\nboth native Norwegian speakers, created question-answer pairs from the collected passages. Each\nstudent received a separate set of passages for annotation. The students received financial remuneration for their efforts and are co-authors of the\npaper describing the resource. \n\n\n\nBibTeX:\n\nAPA:",
"## Dataset Card Authors\n\nVladislav Mikhailov and Lilja Øvrelid",
"## Dataset Card Contact\n\nvladism@URL and liljao@URL"
] |
0da248bbb455ac905b6caab21f3ffb3f12eceb99 |
# Dataset Card for Evaluation run of AIGym/deepseek-coder-1.3b-chat-and-function-calling
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AIGym/deepseek-coder-1.3b-chat-and-function-calling](https://huggingface.co/AIGym/deepseek-coder-1.3b-chat-and-function-calling) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AIGym__deepseek-coder-1.3b-chat-and-function-calling",
"harness_winogrande_5",
split="train")
```
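
Once loaded, the split behaves like any other `datasets.Dataset`. The short sketch below is our own illustration (not part of the auto-generated card) of how one might inspect it; it relies only on standard `datasets` methods and assumes the snippet above has already run.

```python
# Inspect the per-example details of the loaded split (here: winogrande, 5-shot).
print(data.column_names)   # fields logged for each evaluated example
print(data[0])             # one evaluated example with its prompt and metrics

# Converting to pandas can make ad-hoc filtering and aggregation easier.
df = data.to_pandas()
print(df.head())
```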
## Latest results
These are the [latest results from run 2024-02-04T23:54:42.230347](https://huggingface.co/datasets/open-llm-leaderboard/details_AIGym__deepseek-coder-1.3b-chat-and-function-calling/blob/main/results_2024-02-04T23-54-42.230347.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's results in its own split, and the "latest" split always points to the most recent eval):
```python
{
"all": {
"acc": 0.26966838532529325,
"acc_stderr": 0.03152856048648718,
"acc_norm": 0.2711365940971498,
"acc_norm_stderr": 0.03228813751443345,
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766372,
"mc2": 0.43371704166991554,
"mc2_stderr": 0.01505484479340333
},
"harness|arc:challenge|25": {
"acc": 0.22696245733788395,
"acc_stderr": 0.012240491536132865,
"acc_norm": 0.2627986348122867,
"acc_norm_stderr": 0.012862523175351333
},
"harness|hellaswag|10": {
"acc": 0.3301135232025493,
"acc_stderr": 0.004692926794268451,
"acc_norm": 0.3926508663612826,
"acc_norm_stderr": 0.004873421833291587
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.03915450630414251,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.03915450630414251
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.19736842105263158,
"acc_stderr": 0.03238981601699397,
"acc_norm": 0.19736842105263158,
"acc_norm_stderr": 0.03238981601699397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2641509433962264,
"acc_stderr": 0.027134291628741702,
"acc_norm": 0.2641509433962264,
"acc_norm_stderr": 0.027134291628741702
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2986111111111111,
"acc_stderr": 0.038270523579507554,
"acc_norm": 0.2986111111111111,
"acc_norm_stderr": 0.038270523579507554
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.16184971098265896,
"acc_stderr": 0.028083594279575765,
"acc_norm": 0.16184971098265896,
"acc_norm_stderr": 0.028083594279575765
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237655,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237655
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.30638297872340425,
"acc_stderr": 0.030135906478517563,
"acc_norm": 0.30638297872340425,
"acc_norm_stderr": 0.030135906478517563
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03947152782669415,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03947152782669415
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3310344827586207,
"acc_stderr": 0.03921545312467122,
"acc_norm": 0.3310344827586207,
"acc_norm_stderr": 0.03921545312467122
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.02345603738398202,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.02345603738398202
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.20634920634920634,
"acc_stderr": 0.03619604524124251,
"acc_norm": 0.20634920634920634,
"acc_norm_stderr": 0.03619604524124251
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2709677419354839,
"acc_stderr": 0.02528441611490016,
"acc_norm": 0.2709677419354839,
"acc_norm_stderr": 0.02528441611490016
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2512315270935961,
"acc_stderr": 0.030516530732694433,
"acc_norm": 0.2512315270935961,
"acc_norm_stderr": 0.030516530732694433
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.20606060606060606,
"acc_stderr": 0.03158415324047709,
"acc_norm": 0.20606060606060606,
"acc_norm_stderr": 0.03158415324047709
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.23737373737373738,
"acc_stderr": 0.03031371053819889,
"acc_norm": 0.23737373737373738,
"acc_norm_stderr": 0.03031371053819889
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.2538860103626943,
"acc_stderr": 0.03141024780565319,
"acc_norm": 0.2538860103626943,
"acc_norm_stderr": 0.03141024780565319
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.23846153846153847,
"acc_stderr": 0.021606294494647727,
"acc_norm": 0.23846153846153847,
"acc_norm_stderr": 0.021606294494647727
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.02646611753895991,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.02646611753895991
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.026265024608275882,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.026265024608275882
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.26490066225165565,
"acc_stderr": 0.03603038545360384,
"acc_norm": 0.26490066225165565,
"acc_norm_stderr": 0.03603038545360384
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24770642201834864,
"acc_stderr": 0.018508143602547836,
"acc_norm": 0.24770642201834864,
"acc_norm_stderr": 0.018508143602547836
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.37962962962962965,
"acc_stderr": 0.03309682581119035,
"acc_norm": 0.37962962962962965,
"acc_norm_stderr": 0.03309682581119035
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.032282103870378914,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.032282103870378914
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2911392405063291,
"acc_stderr": 0.029571601065753374,
"acc_norm": 0.2911392405063291,
"acc_norm_stderr": 0.029571601065753374
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.26905829596412556,
"acc_stderr": 0.029763779406874975,
"acc_norm": 0.26905829596412556,
"acc_norm_stderr": 0.029763779406874975
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.25190839694656486,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.25190839694656486,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.30578512396694213,
"acc_stderr": 0.04205953933884124,
"acc_norm": 0.30578512396694213,
"acc_norm_stderr": 0.04205953933884124
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.04284467968052192,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.04284467968052192
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.25766871165644173,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.25766871165644173,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285714,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285714
},
"harness|hendrycksTest-management|5": {
"acc": 0.2815533980582524,
"acc_stderr": 0.044532548363264673,
"acc_norm": 0.2815533980582524,
"acc_norm_stderr": 0.044532548363264673
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.3162393162393162,
"acc_stderr": 0.030463656747340265,
"acc_norm": 0.3162393162393162,
"acc_norm_stderr": 0.030463656747340265
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.28607918263090676,
"acc_stderr": 0.016160871405127536,
"acc_norm": 0.28607918263090676,
"acc_norm_stderr": 0.016160871405127536
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.26878612716763006,
"acc_stderr": 0.023868003262500118,
"acc_norm": 0.26878612716763006,
"acc_norm_stderr": 0.023868003262500118
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.26145251396648045,
"acc_stderr": 0.014696599650364546,
"acc_norm": 0.26145251396648045,
"acc_norm_stderr": 0.014696599650364546
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.024848018263875192,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.024848018263875192
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2347266881028939,
"acc_stderr": 0.024071805887677045,
"acc_norm": 0.2347266881028939,
"acc_norm_stderr": 0.024071805887677045
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2654320987654321,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.2654320987654321,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432414,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432414
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2711864406779661,
"acc_stderr": 0.011354581451622985,
"acc_norm": 0.2711864406779661,
"acc_norm_stderr": 0.011354581451622985
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.33455882352941174,
"acc_stderr": 0.028661996202335303,
"acc_norm": 0.33455882352941174,
"acc_norm_stderr": 0.028661996202335303
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.017740899509177788,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.017740899509177788
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.19591836734693877,
"acc_stderr": 0.025409301953225678,
"acc_norm": 0.19591836734693877,
"acc_norm_stderr": 0.025409301953225678
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.263681592039801,
"acc_stderr": 0.031157150869355568,
"acc_norm": 0.263681592039801,
"acc_norm_stderr": 0.031157150869355568
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2891566265060241,
"acc_stderr": 0.03529486801511115,
"acc_norm": 0.2891566265060241,
"acc_norm_stderr": 0.03529486801511115
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.26900584795321636,
"acc_stderr": 0.03401052620104089,
"acc_norm": 0.26900584795321636,
"acc_norm_stderr": 0.03401052620104089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766372,
"mc2": 0.43371704166991554,
"mc2_stderr": 0.01505484479340333
},
"harness|winogrande|5": {
"acc": 0.5169692186266772,
"acc_stderr": 0.014044390401612976
},
"harness|gsm8k|5": {
"acc": 0.03411675511751327,
"acc_stderr": 0.00500021260077329
}
}
```
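
As an illustration of how these per-task scores can be aggregated, the sketch below averages the normalized accuracy over the MMLU (`hendrycksTest`) subtasks. It is our own example, not part of the generated card: it assumes the JSON above has been parsed into a Python dict with the same key layout, and the two-entry sample exists only to show the function running.

```python
def mean_mmlu_acc_norm(results: dict) -> float:
    """Average acc_norm over all MMLU ('hendrycksTest') subtask entries."""
    scores = [
        task["acc_norm"]
        for name, task in results.items()
        if name.startswith("harness|hendrycksTest-")
    ]
    return sum(scores) / len(scores)

# Tiny sample using two entries copied from the results block above.
sample = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.26},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.28888888888888886},
}
print(mean_mmlu_acc_norm(sample))  # ~0.2744
```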
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_AIGym__deepseek-coder-1.3b-chat-and-function-calling | [
"region:us"
] | 2024-02-04T23:56:58+00:00 | {"pretty_name": "Evaluation run of AIGym/deepseek-coder-1.3b-chat-and-function-calling", "dataset_summary": "Dataset automatically created during the evaluation run of model [AIGym/deepseek-coder-1.3b-chat-and-function-calling](https://huggingface.co/AIGym/deepseek-coder-1.3b-chat-and-function-calling) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AIGym__deepseek-coder-1.3b-chat-and-function-calling\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-04T23:54:42.230347](https://huggingface.co/datasets/open-llm-leaderboard/details_AIGym__deepseek-coder-1.3b-chat-and-function-calling/blob/main/results_2024-02-04T23-54-42.230347.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26966838532529325,\n \"acc_stderr\": 0.03152856048648718,\n \"acc_norm\": 0.2711365940971498,\n \"acc_norm_stderr\": 0.03228813751443345,\n \"mc1\": 0.2607099143206854,\n \"mc1_stderr\": 0.015368841620766372,\n \"mc2\": 0.43371704166991554,\n \"mc2_stderr\": 0.01505484479340333\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.22696245733788395,\n \"acc_stderr\": 0.012240491536132865,\n \"acc_norm\": 0.2627986348122867,\n \"acc_norm_stderr\": 0.012862523175351333\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.3301135232025493,\n \"acc_stderr\": 0.004692926794268451,\n \"acc_norm\": 0.3926508663612826,\n \"acc_norm_stderr\": 0.004873421833291587\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.28888888888888886,\n \"acc_stderr\": 0.03915450630414251,\n \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.03915450630414251\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.19736842105263158,\n \"acc_stderr\": 0.03238981601699397,\n \"acc_norm\": 0.19736842105263158,\n \"acc_norm_stderr\": 0.03238981601699397\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2641509433962264,\n \"acc_stderr\": 0.027134291628741702,\n \"acc_norm\": 0.2641509433962264,\n \"acc_norm_stderr\": 0.027134291628741702\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2986111111111111,\n \"acc_stderr\": 0.038270523579507554,\n \"acc_norm\": 
0.2986111111111111,\n \"acc_norm_stderr\": 0.038270523579507554\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.16184971098265896,\n \"acc_stderr\": 0.028083594279575765,\n \"acc_norm\": 0.16184971098265896,\n \"acc_norm_stderr\": 0.028083594279575765\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237655,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237655\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.30638297872340425,\n \"acc_stderr\": 0.030135906478517563,\n \"acc_norm\": 0.30638297872340425,\n \"acc_norm_stderr\": 0.030135906478517563\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n \"acc_stderr\": 0.03947152782669415,\n \"acc_norm\": 0.22807017543859648,\n \"acc_norm_stderr\": 0.03947152782669415\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.3310344827586207,\n \"acc_stderr\": 0.03921545312467122,\n \"acc_norm\": 0.3310344827586207,\n \"acc_norm_stderr\": 0.03921545312467122\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.29365079365079366,\n \"acc_stderr\": 0.02345603738398202,\n \"acc_norm\": 0.29365079365079366,\n \"acc_norm_stderr\": 0.02345603738398202\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.20634920634920634,\n \"acc_stderr\": 0.03619604524124251,\n \"acc_norm\": 0.20634920634920634,\n \"acc_norm_stderr\": 0.03619604524124251\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2709677419354839,\n \"acc_stderr\": 0.02528441611490016,\n \"acc_norm\": 0.2709677419354839,\n \"acc_norm_stderr\": 0.02528441611490016\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2512315270935961,\n \"acc_stderr\": 0.030516530732694433,\n \"acc_norm\": 0.2512315270935961,\n \"acc_norm_stderr\": 0.030516530732694433\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.20606060606060606,\n \"acc_stderr\": 0.03158415324047709,\n \"acc_norm\": 0.20606060606060606,\n \"acc_norm_stderr\": 0.03158415324047709\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.23737373737373738,\n \"acc_stderr\": 0.03031371053819889,\n \"acc_norm\": 0.23737373737373738,\n \"acc_norm_stderr\": 0.03031371053819889\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.2538860103626943,\n 
\"acc_stderr\": 0.03141024780565319,\n \"acc_norm\": 0.2538860103626943,\n \"acc_norm_stderr\": 0.03141024780565319\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.23846153846153847,\n \"acc_stderr\": 0.021606294494647727,\n \"acc_norm\": 0.23846153846153847,\n \"acc_norm_stderr\": 0.021606294494647727\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2518518518518518,\n \"acc_stderr\": 0.02646611753895991,\n \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.02646611753895991\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.026265024608275882,\n \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.026265024608275882\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.26490066225165565,\n \"acc_stderr\": 0.03603038545360384,\n \"acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.03603038545360384\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.24770642201834864,\n \"acc_stderr\": 0.018508143602547836,\n \"acc_norm\": 0.24770642201834864,\n \"acc_norm_stderr\": 0.018508143602547836\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.37962962962962965,\n \"acc_stderr\": 0.03309682581119035,\n \"acc_norm\": 0.37962962962962965,\n \"acc_norm_stderr\": 0.03309682581119035\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.032282103870378914,\n \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.032282103870378914\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2911392405063291,\n \"acc_stderr\": 0.029571601065753374,\n \"acc_norm\": 0.2911392405063291,\n \"acc_norm_stderr\": 0.029571601065753374\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.26905829596412556,\n \"acc_stderr\": 0.029763779406874975,\n \"acc_norm\": 0.26905829596412556,\n \"acc_norm_stderr\": 0.029763779406874975\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.25190839694656486,\n \"acc_stderr\": 0.03807387116306086,\n \"acc_norm\": 0.25190839694656486,\n \"acc_norm_stderr\": 0.03807387116306086\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.30578512396694213,\n \"acc_stderr\": 0.04205953933884124,\n \"acc_norm\": 0.30578512396694213,\n \"acc_norm_stderr\": 0.04205953933884124\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.04284467968052192,\n \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.04284467968052192\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.25766871165644173,\n \"acc_stderr\": 0.03436150827846917,\n \"acc_norm\": 0.25766871165644173,\n \"acc_norm_stderr\": 0.03436150827846917\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n \"acc_stderr\": 0.04464285714285714,\n \"acc_norm\": 0.33035714285714285,\n \"acc_norm_stderr\": 0.04464285714285714\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.2815533980582524,\n \"acc_stderr\": 0.044532548363264673,\n \"acc_norm\": 0.2815533980582524,\n \"acc_norm_stderr\": 0.044532548363264673\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3162393162393162,\n \"acc_stderr\": 0.030463656747340265,\n \"acc_norm\": 0.3162393162393162,\n \"acc_norm_stderr\": 0.030463656747340265\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 
0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.28607918263090676,\n \"acc_stderr\": 0.016160871405127536,\n \"acc_norm\": 0.28607918263090676,\n \"acc_norm_stderr\": 0.016160871405127536\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.26878612716763006,\n \"acc_stderr\": 0.023868003262500118,\n \"acc_norm\": 0.26878612716763006,\n \"acc_norm_stderr\": 0.023868003262500118\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.26145251396648045,\n \"acc_stderr\": 0.014696599650364546,\n \"acc_norm\": 0.26145251396648045,\n \"acc_norm_stderr\": 0.014696599650364546\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.25163398692810457,\n \"acc_stderr\": 0.024848018263875192,\n \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.024848018263875192\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2347266881028939,\n \"acc_stderr\": 0.024071805887677045,\n \"acc_norm\": 0.2347266881028939,\n \"acc_norm_stderr\": 0.024071805887677045\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2654320987654321,\n \"acc_stderr\": 0.024569223600460845,\n \"acc_norm\": 0.2654320987654321,\n \"acc_norm_stderr\": 0.024569223600460845\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432414,\n \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432414\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2711864406779661,\n \"acc_stderr\": 0.011354581451622985,\n \"acc_norm\": 0.2711864406779661,\n \"acc_norm_stderr\": 0.011354581451622985\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.33455882352941174,\n \"acc_stderr\": 0.028661996202335303,\n \"acc_norm\": 0.33455882352941174,\n \"acc_norm_stderr\": 0.028661996202335303\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25980392156862747,\n \"acc_stderr\": 0.017740899509177788,\n \"acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.017740899509177788\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.19591836734693877,\n \"acc_stderr\": 0.025409301953225678,\n \"acc_norm\": 0.19591836734693877,\n \"acc_norm_stderr\": 0.025409301953225678\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.263681592039801,\n \"acc_stderr\": 0.031157150869355568,\n \"acc_norm\": 0.263681592039801,\n \"acc_norm_stderr\": 0.031157150869355568\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2891566265060241,\n \"acc_stderr\": 0.03529486801511115,\n \"acc_norm\": 0.2891566265060241,\n \"acc_norm_stderr\": 0.03529486801511115\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.26900584795321636,\n \"acc_stderr\": 0.03401052620104089,\n \"acc_norm\": 0.26900584795321636,\n \"acc_norm_stderr\": 0.03401052620104089\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2607099143206854,\n \"mc1_stderr\": 0.015368841620766372,\n \"mc2\": 0.43371704166991554,\n \"mc2_stderr\": 0.01505484479340333\n },\n \"harness|winogrande|5\": {\n \"acc\": 
0.5169692186266772,\n \"acc_stderr\": 0.014044390401612976\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.03411675511751327,\n \"acc_stderr\": 0.00500021260077329\n }\n}\n```", "repo_url": "https://huggingface.co/AIGym/deepseek-coder-1.3b-chat-and-function-calling", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|arc:challenge|25_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|gsm8k|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hellaswag|10_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T23-54-42.230347.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T23-54-42.230347.parquet", 
"**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T23-54-42.230347.parquet", 
"**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T23-54-42.230347.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T23-54-42.230347.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": 
["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": 
["**/details_harness|truthfulqa:mc|0_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["**/details_harness|winogrande|5_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-04T23-54-42.230347.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_04T23_54_42.230347", "path": ["results_2024-02-04T23-54-42.230347.parquet"]}, {"split": "latest", "path": ["results_2024-02-04T23-54-42.230347.parquet"]}]}]} | 2024-02-04T23:57:25+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of AIGym/deepseek-coder-1.3b-chat-and-function-calling
Dataset automatically created during the evaluation run of model AIGym/deepseek-coder-1.3b-chat-and-function-calling on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
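A minimal sketch of that load call (not part of the original card — the repo id below is an assumption, following the leaderboard's usual `details_<org>__<model>` naming convention):

```python
from datasets import load_dataset

# Assumed repo id, based on the open-llm-leaderboard naming convention.
data = load_dataset(
    "open-llm-leaderboard/details_AIGym__deepseek-coder-1.3b-chat-and-function-calling",
    "harness_winogrande_5",
    split="train",
)
print(data[0])
```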
## Latest results
These are the latest results from run 2024-02-04T23:54:42.230347 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of AIGym/deepseek-coder-1.3b-chat-and-function-calling\n\n\n\nDataset automatically created during the evaluation run of model AIGym/deepseek-coder-1.3b-chat-and-function-calling on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T23:54:42.230347(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of AIGym/deepseek-coder-1.3b-chat-and-function-calling\n\n\n\nDataset automatically created during the evaluation run of model AIGym/deepseek-coder-1.3b-chat-and-function-calling on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T23:54:42.230347(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
0c983a9912004ea421183cefde425e7333d1dfce |
# Dataset Card for Evaluation run of AIGym/deepseek-coder-6.7b-chat-and-function-calling
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AIGym/deepseek-coder-6.7b-chat-and-function-calling](https://huggingface.co/AIGym/deepseek-coder-6.7b-chat-and-function-calling) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AIGym__deepseek-coder-6.7b-chat-and-function-calling",
"harness_winogrande_5",
split="train")
```
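As a possible follow-up to the snippet above (a sketch, not part of the original card), you can enumerate every available configuration of this details repo and read the "latest" split of one of them with the standard `datasets` helpers:

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_AIGym__deepseek-coder-6.7b-chat-and-function-calling"

# List every available configuration (the per-task ones plus the aggregated "results").
configs = get_dataset_config_names(repo)
print(len(configs), "configurations, e.g.:", configs[:5])

# The "latest" split of each configuration points at the most recent run.
winogrande = load_dataset(repo, "harness_winogrande_5", split="latest")
print(winogrande[0])
```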
## Latest results
These are the [latest results from run 2024-02-04T23:57:59.059131](https://huggingface.co/datasets/open-llm-leaderboard/details_AIGym__deepseek-coder-6.7b-chat-and-function-calling/blob/main/results_2024-02-04T23-57-59.059131.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.3821937199083958,
"acc_stderr": 0.0343357567097525,
"acc_norm": 0.385141155502206,
"acc_norm_stderr": 0.035093328270488326,
"mc1": 0.2594859241126071,
"mc1_stderr": 0.015345409485557982,
"mc2": 0.4283133114995368,
"mc2_stderr": 0.01473438764724853
},
"harness|arc:challenge|25": {
"acc": 0.33276450511945393,
"acc_stderr": 0.013769863046192312,
"acc_norm": 0.3609215017064846,
"acc_norm_stderr": 0.01403476138617546
},
"harness|hellaswag|10": {
"acc": 0.41236805417247563,
"acc_stderr": 0.0049125470401328785,
"acc_norm": 0.5380402310296754,
"acc_norm_stderr": 0.004975319435777099
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.37777777777777777,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.37777777777777777,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.35526315789473684,
"acc_stderr": 0.038947344870133176,
"acc_norm": 0.35526315789473684,
"acc_norm_stderr": 0.038947344870133176
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.41509433962264153,
"acc_stderr": 0.030325945789286105,
"acc_norm": 0.41509433962264153,
"acc_norm_stderr": 0.030325945789286105
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3263888888888889,
"acc_stderr": 0.03921067198982266,
"acc_norm": 0.3263888888888889,
"acc_norm_stderr": 0.03921067198982266
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3352601156069364,
"acc_stderr": 0.03599586301247078,
"acc_norm": 0.3352601156069364,
"acc_norm_stderr": 0.03599586301247078
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3276595744680851,
"acc_stderr": 0.030683020843231,
"acc_norm": 0.3276595744680851,
"acc_norm_stderr": 0.030683020843231
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.043727482902780064,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.043727482902780064
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.42758620689655175,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.42758620689655175,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3306878306878307,
"acc_stderr": 0.02422996529842508,
"acc_norm": 0.3306878306878307,
"acc_norm_stderr": 0.02422996529842508
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.04134913018303316,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.04134913018303316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.38387096774193546,
"acc_stderr": 0.02766618207553964,
"acc_norm": 0.38387096774193546,
"acc_norm_stderr": 0.02766618207553964
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.30049261083743845,
"acc_stderr": 0.03225799476233484,
"acc_norm": 0.30049261083743845,
"acc_norm_stderr": 0.03225799476233484
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.4,
"acc_stderr": 0.03825460278380025,
"acc_norm": 0.4,
"acc_norm_stderr": 0.03825460278380025
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4292929292929293,
"acc_stderr": 0.035265527246011986,
"acc_norm": 0.4292929292929293,
"acc_norm_stderr": 0.035265527246011986
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.39896373056994816,
"acc_stderr": 0.03533999094065696,
"acc_norm": 0.39896373056994816,
"acc_norm_stderr": 0.03533999094065696
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.33076923076923076,
"acc_stderr": 0.023854795680971128,
"acc_norm": 0.33076923076923076,
"acc_norm_stderr": 0.023854795680971128
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683515,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683515
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3697478991596639,
"acc_stderr": 0.03135709599613591,
"acc_norm": 0.3697478991596639,
"acc_norm_stderr": 0.03135709599613591
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.03802039760107903,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.03802039760107903
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3706422018348624,
"acc_stderr": 0.020707458164352984,
"acc_norm": 0.3706422018348624,
"acc_norm_stderr": 0.020707458164352984
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.0321495214780275,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.0321495214780275
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.03354092437591518,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.03354092437591518
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.35864978902953587,
"acc_stderr": 0.031219569445301843,
"acc_norm": 0.35864978902953587,
"acc_norm_stderr": 0.031219569445301843
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3901345291479821,
"acc_stderr": 0.03273766725459156,
"acc_norm": 0.3901345291479821,
"acc_norm_stderr": 0.03273766725459156
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.46564885496183206,
"acc_stderr": 0.043749285605997376,
"acc_norm": 0.46564885496183206,
"acc_norm_stderr": 0.043749285605997376
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.512396694214876,
"acc_stderr": 0.04562951548180765,
"acc_norm": 0.512396694214876,
"acc_norm_stderr": 0.04562951548180765
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4294478527607362,
"acc_stderr": 0.038890666191127216,
"acc_norm": 0.4294478527607362,
"acc_norm_stderr": 0.038890666191127216
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.3883495145631068,
"acc_stderr": 0.048257293373563895,
"acc_norm": 0.3883495145631068,
"acc_norm_stderr": 0.048257293373563895
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6495726495726496,
"acc_stderr": 0.0312561082442188,
"acc_norm": 0.6495726495726496,
"acc_norm_stderr": 0.0312561082442188
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.4061302681992337,
"acc_stderr": 0.017562037406478916,
"acc_norm": 0.4061302681992337,
"acc_norm_stderr": 0.017562037406478916
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4190751445086705,
"acc_stderr": 0.026564178111422622,
"acc_norm": 0.4190751445086705,
"acc_norm_stderr": 0.026564178111422622
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.29608938547486036,
"acc_stderr": 0.015268677317602257,
"acc_norm": 0.29608938547486036,
"acc_norm_stderr": 0.015268677317602257
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4084967320261438,
"acc_stderr": 0.02814640599309636,
"acc_norm": 0.4084967320261438,
"acc_norm_stderr": 0.02814640599309636
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4115755627009646,
"acc_stderr": 0.02795048149440126,
"acc_norm": 0.4115755627009646,
"acc_norm_stderr": 0.02795048149440126
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.02517104191530968,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.02517104191530968
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.34397163120567376,
"acc_stderr": 0.028338017428611334,
"acc_norm": 0.34397163120567376,
"acc_norm_stderr": 0.028338017428611334
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2926988265971317,
"acc_stderr": 0.01162094919584953,
"acc_norm": 0.2926988265971317,
"acc_norm_stderr": 0.01162094919584953
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.41911764705882354,
"acc_stderr": 0.029972807170464622,
"acc_norm": 0.41911764705882354,
"acc_norm_stderr": 0.029972807170464622
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.32516339869281047,
"acc_stderr": 0.018950886770806304,
"acc_norm": 0.32516339869281047,
"acc_norm_stderr": 0.018950886770806304
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4818181818181818,
"acc_stderr": 0.04785964010794917,
"acc_norm": 0.4818181818181818,
"acc_norm_stderr": 0.04785964010794917
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.46530612244897956,
"acc_stderr": 0.03193207024425314,
"acc_norm": 0.46530612244897956,
"acc_norm_stderr": 0.03193207024425314
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.43781094527363185,
"acc_stderr": 0.0350808011219984,
"acc_norm": 0.43781094527363185,
"acc_norm_stderr": 0.0350808011219984
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-virology|5": {
"acc": 0.43373493975903615,
"acc_stderr": 0.03858158940685517,
"acc_norm": 0.43373493975903615,
"acc_norm_stderr": 0.03858158940685517
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.38596491228070173,
"acc_stderr": 0.03733756969066163,
"acc_norm": 0.38596491228070173,
"acc_norm_stderr": 0.03733756969066163
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2594859241126071,
"mc1_stderr": 0.015345409485557982,
"mc2": 0.4283133114995368,
"mc2_stderr": 0.01473438764724853
},
"harness|winogrande|5": {
"acc": 0.5722178374112076,
"acc_stderr": 0.013905134013839944
},
"harness|gsm8k|5": {
"acc": 0.17210007581501138,
"acc_stderr": 0.010397328057879003
}
}
```
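If you prefer to read these aggregated numbers programmatically instead of copy-pasting the JSON above, one option (a sketch; it assumes the "results" configuration declared in this repo behaves like the per-task ones) is:

```python
from datasets import load_dataset

# Sketch: the aggregated metrics live in the "results" configuration,
# whose "latest" split points at the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_AIGym__deepseek-coder-6.7b-chat-and-function-calling",
    "results",
    split="latest",
)
print(results[0])
```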
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_AIGym__deepseek-coder-6.7b-chat-and-function-calling | [
"region:us"
] | 2024-02-05T00:00:17+00:00 | {"pretty_name": "Evaluation run of AIGym/deepseek-coder-6.7b-chat-and-function-calling", "dataset_summary": "Dataset automatically created during the evaluation run of model [AIGym/deepseek-coder-6.7b-chat-and-function-calling](https://huggingface.co/AIGym/deepseek-coder-6.7b-chat-and-function-calling) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AIGym__deepseek-coder-6.7b-chat-and-function-calling\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-04T23:57:59.059131](https://huggingface.co/datasets/open-llm-leaderboard/details_AIGym__deepseek-coder-6.7b-chat-and-function-calling/blob/main/results_2024-02-04T23-57-59.059131.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3821937199083958,\n \"acc_stderr\": 0.0343357567097525,\n \"acc_norm\": 0.385141155502206,\n \"acc_norm_stderr\": 0.035093328270488326,\n \"mc1\": 0.2594859241126071,\n \"mc1_stderr\": 0.015345409485557982,\n \"mc2\": 0.4283133114995368,\n \"mc2_stderr\": 0.01473438764724853\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.33276450511945393,\n \"acc_stderr\": 0.013769863046192312,\n \"acc_norm\": 0.3609215017064846,\n \"acc_norm_stderr\": 0.01403476138617546\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.41236805417247563,\n \"acc_stderr\": 0.0049125470401328785,\n \"acc_norm\": 0.5380402310296754,\n \"acc_norm_stderr\": 0.004975319435777099\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.37777777777777777,\n \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.37777777777777777,\n \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.35526315789473684,\n \"acc_stderr\": 0.038947344870133176,\n \"acc_norm\": 0.35526315789473684,\n \"acc_norm_stderr\": 0.038947344870133176\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.41509433962264153,\n \"acc_stderr\": 0.030325945789286105,\n \"acc_norm\": 0.41509433962264153,\n \"acc_norm_stderr\": 0.030325945789286105\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3263888888888889,\n \"acc_stderr\": 0.03921067198982266,\n 
\"acc_norm\": 0.3263888888888889,\n \"acc_norm_stderr\": 0.03921067198982266\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3352601156069364,\n \"acc_stderr\": 0.03599586301247078,\n \"acc_norm\": 0.3352601156069364,\n \"acc_norm_stderr\": 0.03599586301247078\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.04960449637488583\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.3276595744680851,\n \"acc_stderr\": 0.030683020843231,\n \"acc_norm\": 0.3276595744680851,\n \"acc_norm_stderr\": 0.030683020843231\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n \"acc_stderr\": 0.043727482902780064,\n \"acc_norm\": 0.3157894736842105,\n \"acc_norm_stderr\": 0.043727482902780064\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.42758620689655175,\n \"acc_stderr\": 0.04122737111370333,\n \"acc_norm\": 0.42758620689655175,\n \"acc_norm_stderr\": 0.04122737111370333\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3306878306878307,\n \"acc_stderr\": 0.02422996529842508,\n \"acc_norm\": 0.3306878306878307,\n \"acc_norm_stderr\": 0.02422996529842508\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.38387096774193546,\n \"acc_stderr\": 0.02766618207553964,\n \"acc_norm\": 0.38387096774193546,\n \"acc_norm_stderr\": 0.02766618207553964\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.30049261083743845,\n \"acc_stderr\": 0.03225799476233484,\n \"acc_norm\": 0.30049261083743845,\n \"acc_norm_stderr\": 0.03225799476233484\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.03825460278380025,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.03825460278380025\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.4292929292929293,\n \"acc_stderr\": 0.035265527246011986,\n \"acc_norm\": 0.4292929292929293,\n \"acc_norm_stderr\": 0.035265527246011986\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.39896373056994816,\n \"acc_stderr\": 
0.03533999094065696,\n \"acc_norm\": 0.39896373056994816,\n \"acc_norm_stderr\": 0.03533999094065696\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.33076923076923076,\n \"acc_stderr\": 0.023854795680971128,\n \"acc_norm\": 0.33076923076923076,\n \"acc_norm_stderr\": 0.023854795680971128\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683515,\n \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683515\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.3697478991596639,\n \"acc_stderr\": 0.03135709599613591,\n \"acc_norm\": 0.3697478991596639,\n \"acc_norm_stderr\": 0.03135709599613591\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.03802039760107903,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.03802039760107903\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3706422018348624,\n \"acc_stderr\": 0.020707458164352984,\n \"acc_norm\": 0.3706422018348624,\n \"acc_norm_stderr\": 0.020707458164352984\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.0321495214780275,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.0321495214780275\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.03354092437591518,\n \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.03354092437591518\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.35864978902953587,\n \"acc_stderr\": 0.031219569445301843,\n \"acc_norm\": 0.35864978902953587,\n \"acc_norm_stderr\": 0.031219569445301843\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3901345291479821,\n \"acc_stderr\": 0.03273766725459156,\n \"acc_norm\": 0.3901345291479821,\n \"acc_norm_stderr\": 0.03273766725459156\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.46564885496183206,\n \"acc_stderr\": 0.043749285605997376,\n \"acc_norm\": 0.46564885496183206,\n \"acc_norm_stderr\": 0.043749285605997376\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.512396694214876,\n \"acc_stderr\": 0.04562951548180765,\n \"acc_norm\": 0.512396694214876,\n \"acc_norm_stderr\": 0.04562951548180765\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.28703703703703703,\n \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.28703703703703703,\n \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.4294478527607362,\n \"acc_stderr\": 0.038890666191127216,\n \"acc_norm\": 0.4294478527607362,\n \"acc_norm_stderr\": 0.038890666191127216\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.3883495145631068,\n \"acc_stderr\": 0.048257293373563895,\n \"acc_norm\": 0.3883495145631068,\n \"acc_norm_stderr\": 0.048257293373563895\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6495726495726496,\n \"acc_stderr\": 0.0312561082442188,\n \"acc_norm\": 0.6495726495726496,\n \"acc_norm_stderr\": 0.0312561082442188\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 
0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.4061302681992337,\n \"acc_stderr\": 0.017562037406478916,\n \"acc_norm\": 0.4061302681992337,\n \"acc_norm_stderr\": 0.017562037406478916\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.4190751445086705,\n \"acc_stderr\": 0.026564178111422622,\n \"acc_norm\": 0.4190751445086705,\n \"acc_norm_stderr\": 0.026564178111422622\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.29608938547486036,\n \"acc_stderr\": 0.015268677317602257,\n \"acc_norm\": 0.29608938547486036,\n \"acc_norm_stderr\": 0.015268677317602257\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.4084967320261438,\n \"acc_stderr\": 0.02814640599309636,\n \"acc_norm\": 0.4084967320261438,\n \"acc_norm_stderr\": 0.02814640599309636\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4115755627009646,\n \"acc_stderr\": 0.02795048149440126,\n \"acc_norm\": 0.4115755627009646,\n \"acc_norm_stderr\": 0.02795048149440126\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.28703703703703703,\n \"acc_stderr\": 0.02517104191530968,\n \"acc_norm\": 0.28703703703703703,\n \"acc_norm_stderr\": 0.02517104191530968\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.34397163120567376,\n \"acc_stderr\": 0.028338017428611334,\n \"acc_norm\": 0.34397163120567376,\n \"acc_norm_stderr\": 0.028338017428611334\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2926988265971317,\n \"acc_stderr\": 0.01162094919584953,\n \"acc_norm\": 0.2926988265971317,\n \"acc_norm_stderr\": 0.01162094919584953\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.41911764705882354,\n \"acc_stderr\": 0.029972807170464622,\n \"acc_norm\": 0.41911764705882354,\n \"acc_norm_stderr\": 0.029972807170464622\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.32516339869281047,\n \"acc_stderr\": 0.018950886770806304,\n \"acc_norm\": 0.32516339869281047,\n \"acc_norm_stderr\": 0.018950886770806304\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4818181818181818,\n \"acc_stderr\": 0.04785964010794917,\n \"acc_norm\": 0.4818181818181818,\n \"acc_norm_stderr\": 0.04785964010794917\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.46530612244897956,\n \"acc_stderr\": 0.03193207024425314,\n \"acc_norm\": 0.46530612244897956,\n \"acc_norm_stderr\": 0.03193207024425314\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.43781094527363185,\n \"acc_stderr\": 0.0350808011219984,\n \"acc_norm\": 0.43781094527363185,\n \"acc_norm_stderr\": 0.0350808011219984\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.43373493975903615,\n \"acc_stderr\": 0.03858158940685517,\n \"acc_norm\": 0.43373493975903615,\n \"acc_norm_stderr\": 0.03858158940685517\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.38596491228070173,\n \"acc_stderr\": 0.03733756969066163,\n \"acc_norm\": 0.38596491228070173,\n \"acc_norm_stderr\": 0.03733756969066163\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2594859241126071,\n \"mc1_stderr\": 0.015345409485557982,\n \"mc2\": 0.4283133114995368,\n \"mc2_stderr\": 0.01473438764724853\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5722178374112076,\n 
\"acc_stderr\": 0.013905134013839944\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.17210007581501138,\n \"acc_stderr\": 0.010397328057879003\n }\n}\n```", "repo_url": "https://huggingface.co/AIGym/deepseek-coder-6.7b-chat-and-function-calling", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|arc:challenge|25_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|gsm8k|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hellaswag|10_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T23-57-59.059131.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T23-57-59.059131.parquet", 
"**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T23-57-59.059131.parquet", 
"**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-04T23-57-59.059131.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T23-57-59.059131.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": 
["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": 
["**/details_harness|truthfulqa:mc|0_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["**/details_harness|winogrande|5_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-04T23-57-59.059131.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_04T23_57_59.059131", "path": ["results_2024-02-04T23-57-59.059131.parquet"]}, {"split": "latest", "path": ["results_2024-02-04T23-57-59.059131.parquet"]}]}]} | 2024-02-05T00:00:39+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of AIGym/deepseek-coder-6.7b-chat-and-function-calling
Dataset automatically created during the evaluation run of model AIGym/deepseek-coder-6.7b-chat-and-function-calling on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
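The original code snippet was stripped from this copy of the card, so the following is a minimal sketch of what that load typically looks like. The repository id is an assumption based on the leaderboard's usual `details_<org>__<model>` naming, and `harness_winogrande_5` is one of the task configurations listed in this card's metadata:

```python
from datasets import load_dataset

# Assumed repository id, following the leaderboard's "details_<org>__<model>" convention
data = load_dataset(
    "open-llm-leaderboard/details_AIGym__deepseek-coder-6.7b-chat-and-function-calling",
    "harness_winogrande_5",  # one of the 63 task configurations
    split="train",           # per the card, "train" always points to the latest results
)
```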
## Latest results
These are the latest results from run 2024-02-04T23:57:59.059131 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of AIGym/deepseek-coder-6.7b-chat-and-function-calling\n\n\n\nDataset automatically created during the evaluation run of model AIGym/deepseek-coder-6.7b-chat-and-function-calling on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T23:57:59.059131(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of AIGym/deepseek-coder-6.7b-chat-and-function-calling\n\n\n\nDataset automatically created during the evaluation run of model AIGym/deepseek-coder-6.7b-chat-and-function-calling on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-04T23:57:59.059131(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
95bcc99b9c4d0016fe2b7865194de79119edd0bc |
# medmcqa-rephrased-20k-v0.1
(Rephrased Sample) MedMCQA: A Large-scale Multi-Subject Multi-Choice Dataset for Medical domain Question Answering
## Dataset Details
### Dataset Description
MedMCQA is a large-scale, Multiple-Choice Question Answering (MCQA) dataset designed to address real-world medical entrance exam questions.
MedMCQA contains more than 194k high-quality AIIMS & NEET PG entrance exam MCQs, covering 2.4k healthcare topics and 21 medical subjects, collected with an average token length of 12.77 and high topical diversity.
`medmcqa-rephrased-20k-v0.1` contains 20,000 random samples of MedMCQA questions rephrased into different multiple-choice formats using [`mistralai/Mixtral-8x7B-Instruct-v0.1`](https://huggingface.co/mistralai/Mixtral-8x7B-Instruct-v0.1).
### Dataset Sources
<!-- Provide the basic links for the dataset. -->
- **Repository:** https://github.com/medmcqa/medmcqa
- **Paper:** Pal et al. (2022) [MedMCQA : A Large-scale Multi-Subject Multi-Choice Dataset for Medical domain Question Answering
](https://arxiv.org/abs/2203.14371).
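
## Usage

A minimal loading sketch; it assumes only the standard `datasets` API, plus the single `text` column and `train` split declared in this card's configuration:

```python
from datasets import load_dataset

# Each row holds one rephrased multiple-choice question as a single "text" string.
ds = load_dataset("jon-tow/medmcqa-rephrased-20k-v0.1", split="train")
print(ds[0]["text"])
```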
## Citation
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
```bibtex
@InProceedings{pmlr-v174-pal22a,
title = {MedMCQA: A Large-scale Multi-Subject Multi-Choice Dataset for Medical domain Question Answering},
author = {Pal, Ankit and Umapathi, Logesh Kumar and Sankarasubbu, Malaikannan},
booktitle = {Proceedings of the Conference on Health, Inference, and Learning},
pages = {248--260},
year = {2022},
editor = {Flores, Gerardo and Chen, George H and Pollard, Tom and Ho, Joyce C and Naumann, Tristan},
volume = {174},
series = {Proceedings of Machine Learning Research},
month = {07--08 Apr},
publisher = {PMLR},
pdf = {https://proceedings.mlr.press/v174/pal22a/pal22a.pdf},
url = {https://proceedings.mlr.press/v174/pal22a.html},
abstract = {This paper introduces MedMCQA, a new large-scale, Multiple-Choice Question Answering (MCQA) dataset designed to address real-world medical entrance exam questions. More than 194k high-quality AIIMS & NEET PG entrance exam MCQs covering 2.4k healthcare topics and 21 medical subjects are collected with an average token length of 12.77 and high topical diversity. Each sample contains a question, correct answer(s), and other options which requires a deeper language understanding as it tests the 10+ reasoning abilities of a model across a wide range of medical subjects & topics. A detailed explanation of the solution, along with the above information, is provided in this study.}
}
@misc{medmcqa-rephrased-20k-v0.1,
title = {MedMCQA Rephrased (20k Samples) v0.1},
author = {Jonathan Tow},
year = {2024},
publisher = {HuggingFace},
url = {https://huggingface.co/datasets/jon-tow/medmcqa-rephrased-20k-v0.1}
}
``` | jon-tow/medmcqa-rephrased-20k-v0.1 | [
"license:apache-2.0",
"arxiv:2203.14371",
"region:us"
] | 2024-02-05T01:34:19+00:00 | {"license": "apache-2.0", "dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 8269606, "num_examples": 20000}], "download_size": 4941825, "dataset_size": 8269606}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-05T19:08:45+00:00 | [
"2203.14371"
] | [] | TAGS
#license-apache-2.0 #arxiv-2203.14371 #region-us
|
# medmcqa-rephrased-20k-v0.1
(Rephrased Sample) MedMCQA: A Large-scale Multi-Subject Multi-Choice Dataset for Medical domain Question Answering
## Dataset Details
### Dataset Description
MedMCQA is a large-scale, Multiple-Choice Question Answering (MCQA) dataset designed to address real-world medical entrance exam questions.
MedMCQA has more than 194k high-quality AIIMS & NEET PG entrance exam MCQs covering 2.4k healthcare topics and 21 medical subjects are collected with an average token length of 12.77 and high topical diversity.
'medmcqa-rephrased-20k-v0.1' contains 20,000 random samples of MedMCQA questions rephrased into different multiple-choice formats using 'mistralai/Mixtral-8x7B-Instruct-v0.1'.
### Dataset Sources
- Repository: URL
- Paper: Pal et al. (2022) MedMCQA : A Large-scale Multi-Subject Multi-Choice Dataset for Medical domain Question Answering
.
| [
"# medmcqa-rephrased-20k-v0.1 \n\n(Rephrased Sample) MedMCQA: A Large-scale Multi-Subject Multi-Choice Dataset for Medical domain Question Answering",
"## Dataset Details",
"### Dataset Description\n\nMedMCQA is a large-scale, Multiple-Choice Question Answering (MCQA) dataset designed to address real-world medical entrance exam questions.\nMedMCQA has more than 194k high-quality AIIMS & NEET PG entrance exam MCQs covering 2.4k healthcare topics and 21 medical subjects are collected with an average token length of 12.77 and high topical diversity.\n\n'medmcqa-rephrased-20k-v0.1' contains 20,000 random samples of MedMCQA questions rephrased into different multiple-choice formats using 'mistralai/Mixtral-8x7B-Instruct-v0.1'.",
"### Dataset Sources\n\n\n\n- Repository: URL\n- Paper: Pal et al. (2022) MedMCQA : A Large-scale Multi-Subject Multi-Choice Dataset for Medical domain Question Answering\n."
] | [
"TAGS\n#license-apache-2.0 #arxiv-2203.14371 #region-us \n",
"# medmcqa-rephrased-20k-v0.1 \n\n(Rephrased Sample) MedMCQA: A Large-scale Multi-Subject Multi-Choice Dataset for Medical domain Question Answering",
"## Dataset Details",
"### Dataset Description\n\nMedMCQA is a large-scale, Multiple-Choice Question Answering (MCQA) dataset designed to address real-world medical entrance exam questions.\nMedMCQA has more than 194k high-quality AIIMS & NEET PG entrance exam MCQs covering 2.4k healthcare topics and 21 medical subjects are collected with an average token length of 12.77 and high topical diversity.\n\n'medmcqa-rephrased-20k-v0.1' contains 20,000 random samples of MedMCQA questions rephrased into different multiple-choice formats using 'mistralai/Mixtral-8x7B-Instruct-v0.1'.",
"### Dataset Sources\n\n\n\n- Repository: URL\n- Paper: Pal et al. (2022) MedMCQA : A Large-scale Multi-Subject Multi-Choice Dataset for Medical domain Question Answering\n."
] |
493ee46f7e1d10804e4c30281fcef17ee63f2295 | ## Segmented ImageNet-1K Subset
A subset of ImageNet-1K that has instance segmentation annotations (classes, boxes, and masks), originally intended for use by the [ViT Prisma Library](https://github.com/soniajoseph/ViT-Prisma).
The annotations were autogenerated by [Grounded Segment Anything](https://github.com/IDEA-Research/Grounded-Segment-Anything).
The total size of the dataset is 12,000 images: 10,000 from ImageNet-1K train and 1,000 each from test and val.
### Organization
Images are organized in the same structure as ImageNet-1K:
```
images/
train_images/
val_images/
test_images/
```
The train and val ImageNet classes can be identified from the filenames. See [imagenet-1k classes](https://huggingface.co/datasets/imagenet-1k/blob/main/classes.py).
Masks are stored in a similar manner:
```
masks/
train_masks/
val_masks/
test_masks/
```
Finally `train.json`, `val.json`, `test.json` store box, label, score and path information:
```json
{
"image": "images/val_images/ILSVRC2012_val_00000025_n01616318.JPEG",
"scores": [0.5, 0.44, 0.43, 0.28],
"boxes": [[149, 117, 400, 347], [2, 2, 498, 497], [148, 115, 401, 349], [2, 2, 498, 497]],
"labels": ["bird", "dirt field", "vulture", "land"],
"masks": ["masks/val_masks/ILSVRC2012_val_00000025_n01616318_00.png", "masks/val_masks/ILSVRC2012_val_00000025_n01616318_01.png", "masks/val_masks/ILSVRC2012_val_00000025_n01616318_02.png", "masks/val_masks/ILSVRC2012_val_00000025_n01616318_03.png"]
}
```
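
As a rough usage sketch (not part of the original card), the records above can be consumed with a few lines of Python. The exact on-disk layout of the split files is an assumption noted in the comments:

```python
import json
from PIL import Image

# Assumption: each split file (train.json / val.json / test.json) is a JSON list of
# records shaped like the example above; if it is JSON Lines, read it line by line.
with open("val.json") as f:
    records = json.load(f)

rec = records[0]
image = Image.open(rec["image"])

# For the val filenames shown above, the ImageNet synset id is the last
# underscore-separated token of the name (e.g. "n01616318").
synset = rec["image"].rsplit("_", 1)[-1].split(".")[0]

for label, score, box, mask_path in zip(rec["labels"], rec["scores"], rec["boxes"], rec["masks"]):
    mask = Image.open(mask_path)  # per-instance binary mask PNG
    print(label, score, box, mask.size, synset)
```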
### Citation
Please consider citing this dataset if used in your research:
```bibtex
@misc{segmented_imagenet1k_subset_2024,
author = {ViT-Prisma Contributors},
title = {Segmented ImageNet-1k Subset},
url = {https://huggingface.co/datasets/Prisma-Multimodal/segmented-imagenet1k-subset},
version = {1.0.0},
date = {2024-04-02},
}
```
Grounded Segment Anything and Imagenet can be cited as follows:
```bibtex
@software{grounded_segment_anything,
author = {Grounded-SAM Contributors},
title = {Grounded-Segment-Anything},
url = {https://github.com/IDEA-Research/Grounded-Segment-Anything},
version = {1.2.0},
date = {2023-04-06},
license = {Apache-2.0},
message = {If you use this software, please cite it as below.}
}
```
```bibtex
@article{imagenet15russakovsky,
Author = {Olga Russakovsky and Jia Deng and Hao Su and Jonathan Krause and Sanjeev Satheesh and Sean Ma and Zhiheng Huang and Andrej Karpathy and Aditya Khosla and Michael Bernstein and Alexander C. Berg and Li Fei-Fei},
Title = { {ImageNet Large Scale Visual Recognition Challenge} },
Year = {2015},
journal = {International Journal of Computer Vision (IJCV)},
doi = {10.1007/s11263-015-0816-y},
volume={115},
number={3},
pages={211-252}
}
``` | Prisma-Multimodal/segmented-imagenet1k-subset | [
"region:us"
] | 2024-02-05T01:43:08+00:00 | {} | 2024-02-13T21:06:05+00:00 | [] | [] | TAGS
#region-us
| ## Segmented ImageNet-1K Subset
A subset of ImageNet-1K that has instance segmentation annotations (classes, boxes, and masks), originally intended for use by the ViT Prisma Library.
The annotations were autogenerated by Grounded Segment Anything.
The total size of the dataset is 12,000 images: 10,000 from ImageNet-1K train and 1,000 each from test and val.
### Organization
Images are organized in the same structure as ImageNet-1K:
The train and val ImageNet classes can be identified from the filenames. See imagenet-1k classes.
Masks are stored in a similar manner:
Finally 'URL', 'URL', 'URL' store box, label, score and path information:
Please consider citing this dataset if used in your research:
Grounded Segment Anything and Imagenet can be cited as follows:
| [
"## Segmented ImageNet-1K Subset\n\nA subset of ImageNet-1K that has instance segmentation annotations (classes, boxes, and masks), originally intended for use by the ViT Prisma Library.\n\nThe annotations were autogenerated by Grounded Segment Anything.\n\n\n\nThe total size of the dataset is 12,000 images: 10,000 from ImageNet-1K train and 1,000 each from test and val.",
"### Organization\n\nImages are organized in the same structure as ImageNet-1K:\n\n\n\nThe train and val ImageNet classes can be identified from the filenames. See imagenet-1k classes.\n\nMasks are stored in a similar manner:\n\n\n\nFinally 'URL', 'URL', 'URL' store box, label, score and path information:\n\nPlease consider citing this dataset if used in your research:\n\n\n\nGrounded Segment Anything and Imagenet can be cited as follows:"
] | [
"TAGS\n#region-us \n",
"## Segmented ImageNet-1K Subset\n\nA subset of ImageNet-1K that has instance segmentation annotations (classes, boxes, and masks), originally intended for use by the ViT Prisma Library.\n\nThe annotations were autogenerated by Grounded Segment Anything.\n\n\n\nThe total size of the dataset is 12,000 images: 10,000 from ImageNet-1K train and 1,000 each from test and val.",
"### Organization\n\nImages are organized in the same structure as ImageNet-1K:\n\n\n\nThe train and val ImageNet classes can be identified from the filenames. See imagenet-1k classes.\n\nMasks are stored in a similar manner:\n\n\n\nFinally 'URL', 'URL', 'URL' store box, label, score and path information:\n\nPlease consider citing this dataset if used in your research:\n\n\n\nGrounded Segment Anything and Imagenet can be cited as follows:"
] |
d883f3d341f68065eddd4493fa11f3200a304afc | created a total of 5 images
jlbaker361/dcgan-wikiart1000-resized std: 0.19492517411708832 mean: 4.175252437591553
jlbaker361/dcgan-wikiart1000-clip-resized std: 0.022618815302848816 mean: 4.229665374755859
jlbaker361/dcgan-cond-wikiart1000-resized std: 0.034114643931388855 mean: 4.083883571624756
jlbaker361/dcgan-cond-wikiart1000-clip-resized std: 0.1200457215309143 mean: 4.095186042785644 | jlbaker361/eval-inception-test | [
"region:us"
] | 2024-02-05T02:25:27+00:00 | {} | 2024-02-05T02:25:30+00:00 | [] | [] | TAGS
#region-us
| created a total of 5 images
jlbaker361/dcgan-wikiart1000-resized std: 0.19492517411708832 mean: 4.175252437591553
jlbaker361/dcgan-wikiart1000-clip-resized std: 0.022618815302848816 mean: 4.229665374755859
jlbaker361/dcgan-cond-wikiart1000-resized std: 0.034114643931388855 mean: 4.083883571624756
jlbaker361/dcgan-cond-wikiart1000-clip-resized std: 0.1200457215309143 mean: 4.095186042785644 | [] | [
"TAGS\n#region-us \n"
] |
6dc78a854ebf2075483966a9239406e5c2845b4c | created a total of 5 images
jlbaker361/ddpo-stability-e5 std: 0.12525674700737 mean: 4.365276241302491 inception_mean: 1.0 inception_src: 2.9802322387695312e-08
jlbaker361/ddpo-stability-dcgan-e5 std: 0.1559106409549713 mean: 4.381470108032227 inception_mean: 1.0 inception_src: 5.960464477539063e-08 | jlbaker361/stability-ddpo-inception-test-10 | [
"region:us"
] | 2024-02-05T02:39:51+00:00 | {} | 2024-02-05T03:41:31+00:00 | [] | [] | TAGS
#region-us
| created a total of 5 images
jlbaker361/ddpo-stability-e5 std: 0.12525674700737 mean: 4.365276241302491 inception_mean: 1.0 inception_src: 2.9802322387695312e-08
jlbaker361/ddpo-stability-dcgan-e5 std: 0.1559106409549713 mean: 4.381470108032227 inception_mean: 1.0 inception_src: 5.960464477539063e-08 | [] | [
"TAGS\n#region-us \n"
] |
21cd4314b972fcccae4ab53a9634a4e2d1f6646a | This dataset once scored 4 touchdowns in a single game. | TuringsSolutions/AlBundy500 | [
"license:mit",
"region:us"
] | 2024-02-05T03:06:03+00:00 | {"license": "mit"} | 2024-02-05T03:07:21+00:00 | [] | [] | TAGS
#license-mit #region-us
| This dataset once scored 4 touchdowns in a single game. | [] | [
"TAGS\n#license-mit #region-us \n"
] |
f53fb989a0327e4a7b678889a6f549f092cc469c |
# Indian States GDP numbers
I wanted to find out how Indian states have fared since independence and whether government policies and regimes have made any difference. Since I could not find any aggregate data, I decided to compile it myself, state by state, from a list of government documents.
For the first dataset I will compile only the southern and richer northern states for comparison:
- Andhra Pradesh
- Telangana
- Karnataka
- Kerala
- Maharashtra
- Tamil Nadu
- Gujrat
- Punjab
- West Bengal
## Columns in the dataset
- state: State codes
- start_year: Start of the assessment year
- end_year: End of the assessment year
- value: Value in INR Lacs (1 Lac = 100,000 INR)
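
A small sketch of working with these columns; the CSV file name below is hypothetical and should be adjusted to whatever file this repository actually ships:

```python
import pandas as pd

# Hypothetical file name; point this at the actual data file in the repository.
df = pd.read_csv("indian_states_gdp.csv")

# Values are in INR Lacs (1 Lac = 100,000 INR); 100 Lacs make 1 Crore.
df["value_crore"] = df["value"] / 100

# Latest available assessment year per state.
latest = df.sort_values("end_year").groupby("state").tail(1)
print(latest[["state", "start_year", "end_year", "value_crore"]])
```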
### Sources:
- 1961 - 1984 : [https://mospi.gov.in/sites/default/files/press_releases_statements/Estimates_of_SDP_1960-61_to_1983-84.pdf](https://mospi.gov.in/sites/default/files/press_releases_statements/Estimates_of_SDP_1960-61_to_1983-84.pdf)
- 1993-94 to 05-06: [https://mospi.gov.in/publication/state-domestic-product-state-series-1993-94](https://mospi.gov.in/publication/state-domestic-product-state-series-1993-94)
- 1999-00 to 09-10: [https://mospi.gov.in/publication/state-domestic-product-state-series-1999-2000](https://mospi.gov.in/publication/state-domestic-product-state-series-1999-2000)
- 2011-12 to 22-23: [https://mospi.gov.in/sites/default/files/press_releases_statements/State_wise_SDP_01_08_2023_Rev.xls](https://mospi.gov.in/sites/default/files/press_releases_statements/State_wise_SDP_01_08_2023_Rev.xls)
## Notes
- Sometimes sources have overlapping timelines (for example, the second and third sources both cover 1999-00 to 05-06) and they often report different numbers for the same period. In such cases I have used the figures published at the later stage, on the assumption that they have been updated more recently. | neutralboy/indian_states_gdp | [
"size_categories:n<1K",
"language:en",
"license:mit",
"India, Economics, GDP",
"region:us"
] | 2024-02-05T03:38:46+00:00 | {"language": ["en"], "license": "mit", "size_categories": ["n<1K"], "tags": ["India, Economics, GDP"]} | 2024-02-11T07:10:13+00:00 | [] | [
"en"
] | TAGS
#size_categories-n<1K #language-English #license-mit #India, Economics, GDP #region-us
|
# Indian States GDP numbers
I wanted to find out how Indian states were faring since independence and if government policies and regimes have made any difference. Since I could not find any aggregate data I wanted to use a list of government documents and compile it myself by state.
For the first dataset I will complile only the southern and richer northern states for comparison:
- Andhra Pradesh
- Telangana
- Karnataka
- Kerala
- Maharashtra
- Tamil Nadu
- Gujrat
- Punjab
- West Bengal
## Columns in the dataset
- state: State codes
- start_year: Start of the assessment year
- end_year: End of the assessment year
- value: Value in INR Lacs - 1 Lac is 100000 INR
### Sources:
- 1961 - 1984 : URL
- 1993-94 to 05-06: URL
- 1999-00 to 09-10: URL
- 2011-12 to 22-23: URL
## Notes
- Sometimes when there are overlapping timelines like the second source and third source have overlap from 1999-00 to 05-06 - they often have different numbers. In such a case I have considered the calculation done at a later stage considering that is has been updated recently. | [
"# Indian States GDP numbers\n\nI wanted to find out how Indian states were faring since independence and if government policies and regimes have made any difference. Since I could not find any aggregate data I wanted to use a list of government documents and compile it myself by state.\n\nFor the first dataset I will complile only the southern and richer northern states for comparison:\n- Andhra Pradesh\n- Telangana\n- Karnataka\n- Kerala\n- Maharashtra\n- Tamil Nadu\n- Gujrat\n- Punjab\n- West Bengal",
"## Columns in the dataset\n- state: State codes\n- start_year: Start of the assessment year\n- end_year: End of the assessment year\n- value: Value in INR Lacs - 1 Lac is 100000 INR",
"### Sources:\n- 1961 - 1984 : URL\n- 1993-94 to 05-06: URL\n- 1999-00 to 09-10: URL\n- 2011-12 to 22-23: URL",
"## Notes\n- Sometimes when there are overlapping timelines like the second source and third source have overlap from 1999-00 to 05-06 - they often have different numbers. In such a case I have considered the calculation done at a later stage considering that is has been updated recently."
] | [
"TAGS\n#size_categories-n<1K #language-English #license-mit #India, Economics, GDP #region-us \n",
"# Indian States GDP numbers\n\nI wanted to find out how Indian states were faring since independence and if government policies and regimes have made any difference. Since I could not find any aggregate data I wanted to use a list of government documents and compile it myself by state.\n\nFor the first dataset I will complile only the southern and richer northern states for comparison:\n- Andhra Pradesh\n- Telangana\n- Karnataka\n- Kerala\n- Maharashtra\n- Tamil Nadu\n- Gujrat\n- Punjab\n- West Bengal",
"## Columns in the dataset\n- state: State codes\n- start_year: Start of the assessment year\n- end_year: End of the assessment year\n- value: Value in INR Lacs - 1 Lac is 100000 INR",
"### Sources:\n- 1961 - 1984 : URL\n- 1993-94 to 05-06: URL\n- 1999-00 to 09-10: URL\n- 2011-12 to 22-23: URL",
"## Notes\n- Sometimes when there are overlapping timelines like the second source and third source have overlap from 1999-00 to 05-06 - they often have different numbers. In such a case I have considered the calculation done at a later stage considering that is has been updated recently."
] |
2ca0894fd98bf413c16f7a63e366ecd565701345 |
# Dataset Card for Evaluation run of alnrg2arg/blockchainlabs_joe_bez_seminar
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [alnrg2arg/blockchainlabs_joe_bez_seminar](https://huggingface.co/alnrg2arg/blockchainlabs_joe_bez_seminar) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_alnrg2arg__blockchainlabs_joe_bez_seminar",
"harness_winogrande_5",
split="train")
```
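Along the same lines, a hedged sketch for pulling the aggregated metrics via the "results" configuration (the `latest` split name is assumed from the standard leaderboard metadata layout):

```python
from datasets import load_dataset

# Aggregated metrics for this model's run(s); "latest" is assumed to point at the
# most recent evaluation, mirroring the per-task configurations.
results = load_dataset(
    "open-llm-leaderboard/details_alnrg2arg__blockchainlabs_joe_bez_seminar",
    "results",
    split="latest",
)
print(results[0])
```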
## Latest results
These are the [latest results from run 2024-02-05T07:09:56.959755](https://huggingface.co/datasets/open-llm-leaderboard/details_alnrg2arg__blockchainlabs_joe_bez_seminar/blob/main/results_2024-02-05T07-09-56.959755.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6573713873345745,
"acc_stderr": 0.031962879207777545,
"acc_norm": 0.6566597701932584,
"acc_norm_stderr": 0.03263475444973871,
"mc1": 0.5752753977968176,
"mc1_stderr": 0.017304000957167474,
"mc2": 0.718621868768424,
"mc2_stderr": 0.014725902717803536
},
"harness|arc:challenge|25": {
"acc": 0.7133105802047781,
"acc_stderr": 0.013214986329274774,
"acc_norm": 0.7380546075085325,
"acc_norm_stderr": 0.012849054826858107
},
"harness|hellaswag|10": {
"acc": 0.7127066321449911,
"acc_stderr": 0.004515748192605717,
"acc_norm": 0.8871738697470624,
"acc_norm_stderr": 0.0031573355082588493
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544067,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544067
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086923996,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086923996
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5158730158730159,
"acc_stderr": 0.044698818540726076,
"acc_norm": 0.5158730158730159,
"acc_norm_stderr": 0.044698818540726076
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7967741935483871,
"acc_stderr": 0.02289168798455496,
"acc_norm": 0.7967741935483871,
"acc_norm_stderr": 0.02289168798455496
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8131313131313131,
"acc_stderr": 0.027772533334218967,
"acc_norm": 0.8131313131313131,
"acc_norm_stderr": 0.027772533334218967
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6743589743589744,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.6743589743589744,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.02925290592725197,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.02925290592725197
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455334,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455334
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752598,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752598
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179326,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179326
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.013547415658662257,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.013547415658662257
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7485549132947977,
"acc_stderr": 0.02335736578587403,
"acc_norm": 0.7485549132947977,
"acc_norm_stderr": 0.02335736578587403
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.44692737430167595,
"acc_stderr": 0.01662803003964761,
"acc_norm": 0.44692737430167595,
"acc_norm_stderr": 0.01662803003964761
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.729903536977492,
"acc_stderr": 0.02521804037341063,
"acc_norm": 0.729903536977492,
"acc_norm_stderr": 0.02521804037341063
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.02982074719142248,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.02982074719142248
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4667535853976532,
"acc_stderr": 0.01274197433389723,
"acc_norm": 0.4667535853976532,
"acc_norm_stderr": 0.01274197433389723
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6654411764705882,
"acc_stderr": 0.0286619962023353,
"acc_norm": 0.6654411764705882,
"acc_norm_stderr": 0.0286619962023353
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.01897542792050721,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.01897542792050721
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142783,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142783
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233268,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233268
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5752753977968176,
"mc1_stderr": 0.017304000957167474,
"mc2": 0.718621868768424,
"mc2_stderr": 0.014725902717803536
},
"harness|winogrande|5": {
"acc": 0.8516179952644041,
"acc_stderr": 0.009990706005184136
},
"harness|gsm8k|5": {
"acc": 0.7043214556482184,
"acc_stderr": 0.012570068947898768
}
}
```
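As a convenience, the per-task scores above can be aggregated offline. The sketch below assumes the JSON block above has been saved locally as `results.json`; the filename and the grouping by the `harness|hendrycksTest-` prefix are illustrative rather than part of the official evaluation harness.

```python
import json

# Load the per-task results shown above (assumed to be saved locally as results.json).
with open("results.json") as f:
    results = json.load(f)

# Collect the 5-shot accuracies of the MMLU (hendrycksTest) subtasks.
mmlu_accs = [
    scores["acc"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest-")
]

# Unweighted mean over subtasks; the leaderboard's own aggregate may weight tasks differently.
print(f"{len(mmlu_accs)} MMLU subtasks, mean acc = {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```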
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
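Pending a full field description, each evaluated task is stored as its own configuration (for example `harness_winogrande_5`), with one split per timestamped run plus a `latest` split, alongside an aggregated `results` configuration. A minimal sketch for inspecting one configuration is shown below; the per-example columns are produced by the evaluation harness and are not documented here, so treat the printed features as the authoritative schema.

```python
from datasets import load_dataset

# Inspect one per-task configuration of this evaluation run.
details = load_dataset(
    "open-llm-leaderboard/details_alnrg2arg__blockchainlabs_joe_bez_seminar",
    "harness_winogrande_5",
    split="latest",
)

print(details)           # row count and column names
print(details.features)  # per-example fields logged by the evaluation harness
```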
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed before further recommendations can be made.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_alnrg2arg__blockchainlabs_joe_bez_seminar | [
"region:us"
] | 2024-02-05T03:57:33+00:00 | {"pretty_name": "Evaluation run of alnrg2arg/blockchainlabs_joe_bez_seminar", "dataset_summary": "Dataset automatically created during the evaluation run of model [alnrg2arg/blockchainlabs_joe_bez_seminar](https://huggingface.co/alnrg2arg/blockchainlabs_joe_bez_seminar) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_alnrg2arg__blockchainlabs_joe_bez_seminar\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-05T07:09:56.959755](https://huggingface.co/datasets/open-llm-leaderboard/details_alnrg2arg__blockchainlabs_joe_bez_seminar/blob/main/results_2024-02-05T07-09-56.959755.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6573713873345745,\n \"acc_stderr\": 0.031962879207777545,\n \"acc_norm\": 0.6566597701932584,\n \"acc_norm_stderr\": 0.03263475444973871,\n \"mc1\": 0.5752753977968176,\n \"mc1_stderr\": 0.017304000957167474,\n \"mc2\": 0.718621868768424,\n \"mc2_stderr\": 0.014725902717803536\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7133105802047781,\n \"acc_stderr\": 0.013214986329274774,\n \"acc_norm\": 0.7380546075085325,\n \"acc_norm_stderr\": 0.012849054826858107\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7127066321449911,\n \"acc_stderr\": 0.004515748192605717,\n \"acc_norm\": 0.8871738697470624,\n \"acc_norm_stderr\": 0.0031573355082588493\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544067,\n \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544067\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 
0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.025424835086923996,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086923996\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5158730158730159,\n \"acc_stderr\": 0.044698818540726076,\n \"acc_norm\": 0.5158730158730159,\n \"acc_norm_stderr\": 0.044698818540726076\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7967741935483871,\n \"acc_stderr\": 0.02289168798455496,\n \"acc_norm\": 0.7967741935483871,\n \"acc_norm_stderr\": 0.02289168798455496\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8131313131313131,\n \"acc_stderr\": 0.027772533334218967,\n \"acc_norm\": 0.8131313131313131,\n \"acc_norm_stderr\": 0.027772533334218967\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n \"acc_norm\": 
0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3592592592592593,\n \"acc_stderr\": 0.02925290592725197,\n \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.02925290592725197\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455334,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455334\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.020930193185179326,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.020930193185179326\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n \"acc_stderr\": 0.013547415658662257,\n \"acc_norm\": 0.8263090676883781,\n \"acc_norm_stderr\": 0.013547415658662257\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7485549132947977,\n \"acc_stderr\": 0.02335736578587403,\n \"acc_norm\": 0.7485549132947977,\n \"acc_norm_stderr\": 0.02335736578587403\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.44692737430167595,\n \"acc_stderr\": 0.01662803003964761,\n \"acc_norm\": 0.44692737430167595,\n \"acc_norm_stderr\": 0.01662803003964761\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.729903536977492,\n \"acc_stderr\": 0.02521804037341063,\n \"acc_norm\": 0.729903536977492,\n \"acc_norm_stderr\": 0.02521804037341063\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.02982074719142248,\n \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.02982074719142248\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4667535853976532,\n \"acc_stderr\": 0.01274197433389723,\n \"acc_norm\": 0.4667535853976532,\n \"acc_norm_stderr\": 0.01274197433389723\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6654411764705882,\n \"acc_stderr\": 0.0286619962023353,\n \"acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.0286619962023353\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.01897542792050721,\n \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.01897542792050721\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142783,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142783\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.026508590656233268,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.026508590656233268\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5752753977968176,\n \"mc1_stderr\": 0.017304000957167474,\n \"mc2\": 0.718621868768424,\n \"mc2_stderr\": 0.014725902717803536\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8516179952644041,\n \"acc_stderr\": 0.009990706005184136\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7043214556482184,\n \"acc_stderr\": 
0.012570068947898768\n }\n}\n```", "repo_url": "https://huggingface.co/alnrg2arg/blockchainlabs_joe_bez_seminar", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|arc:challenge|25_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|arc:challenge|25_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|gsm8k|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|gsm8k|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hellaswag|10_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|hellaswag|10_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T03-55-13.130277.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-05T03-55-13.130277.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-05T03-55-13.130277.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T03-55-13.130277.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T03-55-13.130277.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-05T03-55-13.130277.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T03-55-13.130277.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T03-55-13.130277.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T03-55-13.130277.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T03-55-13.130277.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-05T03-55-13.130277.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-05T03-55-13.130277.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T03-55-13.130277.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-05T03-55-13.130277.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T03-55-13.130277.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T03-55-13.130277.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T03-55-13.130277.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-05T03-55-13.130277.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T03-55-13.130277.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T03-55-13.130277.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T03-55-13.130277.parquet", 
"**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T03-55-13.130277.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T03-55-13.130277.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T03-55-13.130277.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T03-55-13.130277.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T03-55-13.130277.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T03-55-13.130277.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T03-55-13.130277.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T03-55-13.130277.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T03-55-13.130277.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T03-55-13.130277.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T03-55-13.130277.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-05T03-55-13.130277.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T03-55-13.130277.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-05T03-55-13.130277.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T03-55-13.130277.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T03-55-13.130277.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T03-55-13.130277.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-05T03-55-13.130277.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-05T03-55-13.130277.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T03-55-13.130277.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T03-55-13.130277.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T03-55-13.130277.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T03-55-13.130277.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-05T03-55-13.130277.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-05T03-55-13.130277.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-05T03-55-13.130277.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T03-55-13.130277.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-05T03-55-13.130277.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T03-55-13.130277.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T03-55-13.130277.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-05T03-55-13.130277.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-05T03-55-13.130277.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-05T03-55-13.130277.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T03-55-13.130277.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-05T03-55-13.130277.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-05T07-09-56.959755.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-05T07-09-56.959755.parquet", 
"**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T07-09-56.959755.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-05T07-09-56.959755.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T07-09-56.959755.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": 
"2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": 
"2024_02_05T07_09_56.959755", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T03-55-13.130277.parquet"]}, 
{"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["**/details_harness|winogrande|5_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": ["**/details_harness|winogrande|5_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-05T07-09-56.959755.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_05T03_55_13.130277", "path": ["results_2024-02-05T03-55-13.130277.parquet"]}, {"split": "2024_02_05T07_09_56.959755", "path": 
["results_2024-02-05T07-09-56.959755.parquet"]}, {"split": "latest", "path": ["results_2024-02-05T07-09-56.959755.parquet"]}]}]} | 2024-02-05T07:12:39+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of alnrg2arg/blockchainlabs_joe_bez_seminar
Dataset automatically created during the evaluation run of model alnrg2arg/blockchainlabs_joe_bez_seminar on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
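The loading snippet was stripped from this copy of the card; as a purely illustrative sketch, the call below follows the leaderboard's usual `details_<org>__<model>` naming convention for the details repository, so the repository id is an assumption rather than something stated here.

```python
from datasets import load_dataset

# Hypothetical repository id, inferred from the leaderboard's naming convention.
data = load_dataset(
    "open-llm-leaderboard/details_alnrg2arg__blockchainlabs_joe_bez_seminar",
    "harness_winogrande_5",
    split="train",
)
```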
## Latest results
These are the latest results from run 2024-02-05T07:09:56.959755 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of alnrg2arg/blockchainlabs_joe_bez_seminar\n\n\n\nDataset automatically created during the evaluation run of model alnrg2arg/blockchainlabs_joe_bez_seminar on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-05T07:09:56.959755(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of alnrg2arg/blockchainlabs_joe_bez_seminar\n\n\n\nDataset automatically created during the evaluation run of model alnrg2arg/blockchainlabs_joe_bez_seminar on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-05T07:09:56.959755(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
da07dfb0d8905cb582cd3bc6e93ad99ffa7a594f |
# Dataset Card for Evaluation run of adonlee/Mistral_7B_SFT_DPO_v0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [adonlee/Mistral_7B_SFT_DPO_v0](https://huggingface.co/adonlee/Mistral_7B_SFT_DPO_v0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_adonlee__Mistral_7B_SFT_DPO_v0",
"harness_winogrande_5",
split="train")
```
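If you only need the aggregated metrics rather than the per-sample details, the same pattern should work against the "results" configuration mentioned above. This is a sketch: it assumes the "latest" split alias is available and makes no assumption about the exact column layout.

```python
from datasets import load_dataset

# Sketch: load the aggregated metrics. "results" and "latest" follow the
# conventions described in this card; the column layout is not assumed here.
results = load_dataset(
    "open-llm-leaderboard/details_adonlee__Mistral_7B_SFT_DPO_v0",
    "results",
    split="latest",
)

print(results.column_names)  # inspect what the run exposes before indexing
print(results[0])            # one row per aggregated results file
```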
## Latest results
These are the [latest results from run 2024-02-05T04:12:04.142911](https://huggingface.co/datasets/open-llm-leaderboard/details_adonlee__Mistral_7B_SFT_DPO_v0/blob/main/results_2024-02-05T04-12-04.142911.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6481499210250342,
"acc_stderr": 0.032001744813704644,
"acc_norm": 0.6490978191604729,
"acc_norm_stderr": 0.0326549204033805,
"mc1": 0.5520195838433293,
"mc1_stderr": 0.01740851306342291,
"mc2": 0.697182404558769,
"mc2_stderr": 0.015034572567275492
},
"harness|arc:challenge|25": {
"acc": 0.6168941979522184,
"acc_stderr": 0.014206472661672876,
"acc_norm": 0.6629692832764505,
"acc_norm_stderr": 0.013813476652902272
},
"harness|hellaswag|10": {
"acc": 0.6597291376219877,
"acc_stderr": 0.004728318577835212,
"acc_norm": 0.8490340569607648,
"acc_norm_stderr": 0.0035728399695219883
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.04605661864718381,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04605661864718381
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742399,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742399
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7236842105263158,
"acc_stderr": 0.03639057569952929,
"acc_norm": 0.7236842105263158,
"acc_norm_stderr": 0.03639057569952929
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695238,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695238
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933714,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933714
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.0358687928008034,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.0358687928008034
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.548936170212766,
"acc_stderr": 0.03252909619613197,
"acc_norm": 0.548936170212766,
"acc_norm_stderr": 0.03252909619613197
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.025253032554997692,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.025253032554997692
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.02366421667164251,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.02366421667164251
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8,
"acc_stderr": 0.03123475237772117,
"acc_norm": 0.8,
"acc_norm_stderr": 0.03123475237772117
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758723,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758723
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563976,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948492,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948492
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.039837983066598075,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.039837983066598075
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8275229357798165,
"acc_stderr": 0.016197807956848036,
"acc_norm": 0.8275229357798165,
"acc_norm_stderr": 0.016197807956848036
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.03408655867977748,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.03408655867977748
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639325,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639325
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8270042194092827,
"acc_stderr": 0.024621562866768438,
"acc_norm": 0.8270042194092827,
"acc_norm_stderr": 0.024621562866768438
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313729,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313729
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037182,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037182
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.039166677628225836,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.039166677628225836
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9017094017094017,
"acc_stderr": 0.019503444900757567,
"acc_norm": 0.9017094017094017,
"acc_norm_stderr": 0.019503444900757567
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8186462324393359,
"acc_stderr": 0.01377869377846408,
"acc_norm": 0.8186462324393359,
"acc_norm_stderr": 0.01377869377846408
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3318435754189944,
"acc_stderr": 0.015748421208187306,
"acc_norm": 0.3318435754189944,
"acc_norm_stderr": 0.015748421208187306
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826528,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826528
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.729903536977492,
"acc_stderr": 0.025218040373410622,
"acc_norm": 0.729903536977492,
"acc_norm_stderr": 0.025218040373410622
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4876140808344198,
"acc_stderr": 0.01276631731547356,
"acc_norm": 0.4876140808344198,
"acc_norm_stderr": 0.01276631731547356
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.028332959514031204,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.028332959514031204
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083383,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8706467661691543,
"acc_stderr": 0.02372983088101853,
"acc_norm": 0.8706467661691543,
"acc_norm_stderr": 0.02372983088101853
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5520195838433293,
"mc1_stderr": 0.01740851306342291,
"mc2": 0.697182404558769,
"mc2_stderr": 0.015034572567275492
},
"harness|winogrande|5": {
"acc": 0.8176795580110497,
"acc_stderr": 0.010851565594267202
},
"harness|gsm8k|5": {
"acc": 0.6580742987111448,
"acc_stderr": 0.013066089625182808
}
}
```
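For a quick look at the same numbers without going through `datasets`, the raw JSON linked above can be downloaded directly from the repository. The snippet below is only a sketch: it treats the exact nesting of that file (metrics at the top level versus under a "results" key) as unknown and handles both cases defensively.

```python
import json

from huggingface_hub import hf_hub_download

# Sketch: fetch the raw results file linked in this section.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_adonlee__Mistral_7B_SFT_DPO_v0",
    filename="results_2024-02-05T04-12-04.142911.json",
    repo_type="dataset",
)

with open(path) as f:
    report = json.load(f)

# The metrics shown above may sit at the top level or under a "results" key,
# depending on how the harness serialized the file.
metrics = report.get("results", report)
print(metrics["all"])
```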
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_adonlee__Mistral_7B_SFT_DPO_v0 | [
"region:us"
] | 2024-02-05T04:14:22+00:00 | {"pretty_name": "Evaluation run of adonlee/Mistral_7B_SFT_DPO_v0", "dataset_summary": "Dataset automatically created during the evaluation run of model [adonlee/Mistral_7B_SFT_DPO_v0](https://huggingface.co/adonlee/Mistral_7B_SFT_DPO_v0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_adonlee__Mistral_7B_SFT_DPO_v0\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-05T04:12:04.142911](https://huggingface.co/datasets/open-llm-leaderboard/details_adonlee__Mistral_7B_SFT_DPO_v0/blob/main/results_2024-02-05T04-12-04.142911.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6481499210250342,\n \"acc_stderr\": 0.032001744813704644,\n \"acc_norm\": 0.6490978191604729,\n \"acc_norm_stderr\": 0.0326549204033805,\n \"mc1\": 0.5520195838433293,\n \"mc1_stderr\": 0.01740851306342291,\n \"mc2\": 0.697182404558769,\n \"mc2_stderr\": 0.015034572567275492\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6168941979522184,\n \"acc_stderr\": 0.014206472661672876,\n \"acc_norm\": 0.6629692832764505,\n \"acc_norm_stderr\": 0.013813476652902272\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6597291376219877,\n \"acc_stderr\": 0.004728318577835212,\n \"acc_norm\": 0.8490340569607648,\n \"acc_norm_stderr\": 0.0035728399695219883\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742399,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742399\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7236842105263158,\n \"acc_stderr\": 0.03639057569952929,\n \"acc_norm\": 0.7236842105263158,\n \"acc_norm_stderr\": 0.03639057569952929\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695238,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695238\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933714,\n \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933714\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.0358687928008034,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.0358687928008034\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.03252909619613197,\n \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.03252909619613197\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4021164021164021,\n \"acc_stderr\": 0.025253032554997692,\n \"acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.025253032554997692\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n \"acc_stderr\": 0.02366421667164251,\n \"acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.02366421667164251\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.03123475237772117,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.03123475237772117\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758723,\n \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758723\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563976,\n \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563976\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948492,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948492\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.39072847682119205,\n \"acc_stderr\": 0.039837983066598075,\n \"acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.039837983066598075\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8275229357798165,\n \"acc_stderr\": 0.016197807956848036,\n \"acc_norm\": 0.8275229357798165,\n \"acc_norm_stderr\": 0.016197807956848036\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4861111111111111,\n \"acc_stderr\": 0.03408655867977748,\n \"acc_norm\": 0.4861111111111111,\n \"acc_norm_stderr\": 0.03408655867977748\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.803921568627451,\n \"acc_stderr\": 0.027865942286639325,\n \"acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639325\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8270042194092827,\n \"acc_stderr\": 0.024621562866768438,\n \"acc_norm\": 0.8270042194092827,\n \"acc_norm_stderr\": 0.024621562866768438\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313729,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313729\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.03755265865037182,\n \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.03755265865037182\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.039166677628225836,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.039166677628225836\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9017094017094017,\n \"acc_stderr\": 0.019503444900757567,\n \"acc_norm\": 0.9017094017094017,\n \"acc_norm_stderr\": 0.019503444900757567\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n 
\"acc\": 0.8186462324393359,\n \"acc_stderr\": 0.01377869377846408,\n \"acc_norm\": 0.8186462324393359,\n \"acc_norm_stderr\": 0.01377869377846408\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.02440517393578323,\n \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.02440517393578323\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3318435754189944,\n \"acc_stderr\": 0.015748421208187306,\n \"acc_norm\": 0.3318435754189944,\n \"acc_norm_stderr\": 0.015748421208187306\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826528,\n \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826528\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.729903536977492,\n \"acc_stderr\": 0.025218040373410622,\n \"acc_norm\": 0.729903536977492,\n \"acc_norm_stderr\": 0.025218040373410622\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4876140808344198,\n \"acc_stderr\": 0.01276631731547356,\n \"acc_norm\": 0.4876140808344198,\n \"acc_norm_stderr\": 0.01276631731547356\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.028332959514031204,\n \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.028332959514031204\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8706467661691543,\n \"acc_stderr\": 0.02372983088101853,\n \"acc_norm\": 0.8706467661691543,\n \"acc_norm_stderr\": 0.02372983088101853\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5520195838433293,\n \"mc1_stderr\": 0.01740851306342291,\n \"mc2\": 0.697182404558769,\n \"mc2_stderr\": 0.015034572567275492\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8176795580110497,\n \"acc_stderr\": 0.010851565594267202\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6580742987111448,\n \"acc_stderr\": 
0.013066089625182808\n }\n}\n```", "repo_url": "https://huggingface.co/adonlee/Mistral_7B_SFT_DPO_v0", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|arc:challenge|25_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|gsm8k|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|hellaswag|10_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T04-12-04.142911.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T04-12-04.142911.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-05T04-12-04.142911.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-05T04-12-04.142911.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T04-12-04.142911.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_05T04_12_04.142911", "path": ["**/details_harness|winogrande|5_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-05T04-12-04.142911.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_05T04_12_04.142911", "path": ["results_2024-02-05T04-12-04.142911.parquet"]}, {"split": "latest", "path": ["results_2024-02-05T04-12-04.142911.parquet"]}]}]} | 2024-02-05T04:14:49+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of adonlee/Mistral_7B_SFT_DPO_v0
Dataset automatically created during the evaluation run of model adonlee/Mistral_7B_SFT_DPO_v0 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
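A minimal sketch is shown below (the original snippet was not preserved in this card; the repository id is assumed from the leaderboard's usual `details_<org>__<model>` naming convention, and `harness_winogrande_5` is just one example configuration):

```python
from datasets import load_dataset

# Load one evaluation configuration from the details dataset.
# Repo id and config name are assumptions based on the leaderboard's conventions.
data = load_dataset(
    "open-llm-leaderboard/details_adonlee__Mistral_7B_SFT_DPO_v0",
    "harness_winogrande_5",
    split="train",
)
print(data)
```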
## Latest results
These are the latest results from run 2024-02-05T04:12:04.142911 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of adonlee/Mistral_7B_SFT_DPO_v0\n\n\n\nDataset automatically created during the evaluation run of model adonlee/Mistral_7B_SFT_DPO_v0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-05T04:12:04.142911(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of adonlee/Mistral_7B_SFT_DPO_v0\n\n\n\nDataset automatically created during the evaluation run of model adonlee/Mistral_7B_SFT_DPO_v0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-05T04:12:04.142911(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
a0b9f76b3b62b09e68d8221cb09662ad584409c2 |
This dataset was created semi-synthetically using a RAG system containing crop nutrition and environmental condition requirements for various plants, sourced from agricultural college data, along with open nutrient projects data, connected to a ChatGPT4 API, put together by Copyleft Cultivars Nonprofit, then cleaned lightly by Caleb DeLeeuw.
The dataset is in JSON format. | CopyleftCultivars/SemiSynthetic_Locally_Growing_Plants_by_Region | [
"license:other",
"region:us"
] | 2024-02-05T04:20:11+00:00 | {"license": "other", "license_name": "hl3-cl-eco-extr", "license_link": "https://firstdonoharm.dev/version/3/0/cl-eco-extr.html"} | 2024-02-08T06:43:10+00:00 | [] | [] | TAGS
#license-other #region-us
|
This dataset was created semi-synthetically using a RAG system containing crop nutrition and environmental condition requirements for various plants, sourced from agricultural college data, along with open nutrient projects data, connected to a ChatGPT4 API, put together by Copyleft Cultivars Nonprofit, then cleaned lightly by Caleb DeLeeuw.
The dataset is in JSON format. | [] | [
"TAGS\n#license-other #region-us \n"
] |
07a737f1ae55ec53efc2ac9fcf45f45658079341 | # Dataset Card for "alpaca_farm-reward-model-deberta-v3-large-v2-re-eval-preference"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | Mitsuki-Sakamoto/alpaca_farm-reward-model-deberta-v3-large-v2-re-eval-preference | [
"region:us"
] | 2024-02-05T04:29:38+00:00 | {"dataset_info": {"config_name": "alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500", "features": [{"name": "instruction", "dtype": "string"}, {"name": "input", "dtype": "string"}, {"name": "output", "dtype": "string"}, {"name": "preference", "dtype": "int64"}, {"name": "output_1", "dtype": "string"}, {"name": "output_2", "dtype": "string"}, {"name": "reward_model_prompt_format", "dtype": "string"}, {"name": "gen_prompt_format", "dtype": "string"}, {"name": "gen_kwargs", "struct": [{"name": "do_sample", "dtype": "bool"}, {"name": "max_new_tokens", "dtype": "int64"}, {"name": "pad_token_id", "dtype": "int64"}, {"name": "top_k", "dtype": "int64"}, {"name": "top_p", "dtype": "float64"}]}, {"name": "reward_1", "dtype": "float64"}, {"name": "reward_2", "dtype": "float64"}], "splits": [{"name": "val", "num_bytes": 2575959, "num_examples": 2000}], "download_size": 1232867, "dataset_size": 2575959}, "configs": [{"config_name": "alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500", "data_files": [{"split": "val", "path": "alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/val-*"}]}]} | 2024-02-05T04:29:42+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "alpaca_farm-reward-model-deberta-v3-large-v2-re-eval-preference"
More Information needed | [
"# Dataset Card for \"alpaca_farm-reward-model-deberta-v3-large-v2-re-eval-preference\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"alpaca_farm-reward-model-deberta-v3-large-v2-re-eval-preference\"\n\nMore Information needed"
] |
b1754fb61603ffef50f039a863f9ae30d4ed9b64 |
# Dataset Card for Evaluation run of ZoidBB/Kory-0.1-11b-pre1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ZoidBB/Kory-0.1-11b-pre1](https://huggingface.co/ZoidBB/Kory-0.1-11b-pre1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ZoidBB__Kory-0.1-11b-pre1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-05T04:36:04.521362](https://huggingface.co/datasets/open-llm-leaderboard/details_ZoidBB__Kory-0.1-11b-pre1/blob/main/results_2024-02-05T04-36-04.521362.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6505323383754075,
"acc_stderr": 0.032253198164797686,
"acc_norm": 0.6512201411073144,
"acc_norm_stderr": 0.03292314992355177,
"mc1": 0.5214198286413708,
"mc1_stderr": 0.01748743214471164,
"mc2": 0.6768329789105285,
"mc2_stderr": 0.015242959246212948
},
"harness|arc:challenge|25": {
"acc": 0.7090443686006825,
"acc_stderr": 0.013273077865907588,
"acc_norm": 0.7286689419795221,
"acc_norm_stderr": 0.012993807727545803
},
"harness|hellaswag|10": {
"acc": 0.6955785700059749,
"acc_stderr": 0.004592215118295278,
"acc_norm": 0.879008165704043,
"acc_norm_stderr": 0.0032545129328064
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249387,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249387
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.036146654241808254,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.036146654241808254
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4312169312169312,
"acc_stderr": 0.025506481698138208,
"acc_norm": 0.4312169312169312,
"acc_norm_stderr": 0.025506481698138208
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.031922715695483,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.031922715695483
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229872,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229872
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.02150024957603346,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.02150024957603346
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563973,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563973
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37407407407407406,
"acc_stderr": 0.02950286112895529,
"acc_norm": 0.37407407407407406,
"acc_norm_stderr": 0.02950286112895529
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.039837983066598075,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.039837983066598075
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.015555802713590172,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.015555802713590172
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.03408655867977748,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.03408655867977748
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.026460569561240644,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.026460569561240644
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.026361651668389094,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.026361651668389094
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8186462324393359,
"acc_stderr": 0.013778693778464074,
"acc_norm": 0.8186462324393359,
"acc_norm_stderr": 0.013778693778464074
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.023948512905468365,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.023948512905468365
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.44581005586592176,
"acc_stderr": 0.016623998513333106,
"acc_norm": 0.44581005586592176,
"acc_norm_stderr": 0.016623998513333106
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.02465968518596728,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.02465968518596728
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46284224250325945,
"acc_stderr": 0.012734923579532069,
"acc_norm": 0.46284224250325945,
"acc_norm_stderr": 0.012734923579532069
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.019117213911495144,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.019117213911495144
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.029043088683304324,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.029043088683304324
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5214198286413708,
"mc1_stderr": 0.01748743214471164,
"mc2": 0.6768329789105285,
"mc2_stderr": 0.015242959246212948
},
"harness|winogrande|5": {
"acc": 0.8539857932123125,
"acc_stderr": 0.009924440374585243
},
"harness|gsm8k|5": {
"acc": 0.6095526914329037,
"acc_stderr": 0.01343782986466858
}
}
```
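
A small, hedged sketch follows (not part of the original card): it pulls the aggregated numbers above from the "results" configuration described earlier. The exact column layout of the results parquet is not documented here, so the snippet loads the configuration and inspects what is available rather than assuming specific fields:

```python
from datasets import load_dataset

# Load the aggregated "results" configuration for this evaluation run.
# The "latest" split points at the most recent results parquet file.
results = load_dataset(
    "open-llm-leaderboard/details_ZoidBB__Kory-0.1-11b-pre1",
    "results",
    split="latest",
)

# The parquet's column layout is not documented in this card, so inspect it first.
print(results.column_names)
print(results[0])
```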
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_ZoidBB__Kory-0.1-11b-pre1 | [
"region:us"
] | 2024-02-05T04:38:19+00:00 | {"pretty_name": "Evaluation run of ZoidBB/Kory-0.1-11b-pre1", "dataset_summary": "Dataset automatically created during the evaluation run of model [ZoidBB/Kory-0.1-11b-pre1](https://huggingface.co/ZoidBB/Kory-0.1-11b-pre1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ZoidBB__Kory-0.1-11b-pre1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-05T04:36:04.521362](https://huggingface.co/datasets/open-llm-leaderboard/details_ZoidBB__Kory-0.1-11b-pre1/blob/main/results_2024-02-05T04-36-04.521362.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6505323383754075,\n \"acc_stderr\": 0.032253198164797686,\n \"acc_norm\": 0.6512201411073144,\n \"acc_norm_stderr\": 0.03292314992355177,\n \"mc1\": 0.5214198286413708,\n \"mc1_stderr\": 0.01748743214471164,\n \"mc2\": 0.6768329789105285,\n \"mc2_stderr\": 0.015242959246212948\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7090443686006825,\n \"acc_stderr\": 0.013273077865907588,\n \"acc_norm\": 0.7286689419795221,\n \"acc_norm_stderr\": 0.012993807727545803\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6955785700059749,\n \"acc_stderr\": 0.004592215118295278,\n \"acc_norm\": 0.879008165704043,\n \"acc_norm_stderr\": 0.0032545129328064\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249387,\n \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249387\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n 
\"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4312169312169312,\n \"acc_stderr\": 0.025506481698138208,\n \"acc_norm\": 0.4312169312169312,\n \"acc_norm_stderr\": 0.025506481698138208\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695483,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695483\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.028606204289229872,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229872\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603346,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603346\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563973,\n \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563973\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.37407407407407406,\n \"acc_stderr\": 0.02950286112895529,\n \"acc_norm\": 0.37407407407407406,\n \"acc_norm_stderr\": 0.02950286112895529\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.39072847682119205,\n \"acc_stderr\": 0.039837983066598075,\n \"acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.039837983066598075\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8440366972477065,\n \"acc_stderr\": 0.015555802713590172,\n \"acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.015555802713590172\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4861111111111111,\n \"acc_stderr\": 0.03408655867977748,\n \"acc_norm\": 0.4861111111111111,\n \"acc_norm_stderr\": 0.03408655867977748\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240644,\n \"acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240644\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.026361651668389094,\n \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.026361651668389094\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8186462324393359,\n \"acc_stderr\": 0.013778693778464074,\n \"acc_norm\": 0.8186462324393359,\n \"acc_norm_stderr\": 0.013778693778464074\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.023948512905468365,\n \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.023948512905468365\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.44581005586592176,\n \"acc_stderr\": 0.016623998513333106,\n \"acc_norm\": 0.44581005586592176,\n \"acc_norm_stderr\": 0.016623998513333106\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.02465968518596728,\n \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.02465968518596728\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46284224250325945,\n \"acc_stderr\": 0.012734923579532069,\n \"acc_norm\": 0.46284224250325945,\n \"acc_norm_stderr\": 0.012734923579532069\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6633986928104575,\n \"acc_stderr\": 0.019117213911495144,\n \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.019117213911495144\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.029043088683304324,\n \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.029043088683304324\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5214198286413708,\n \"mc1_stderr\": 0.01748743214471164,\n \"mc2\": 0.6768329789105285,\n \"mc2_stderr\": 0.015242959246212948\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8539857932123125,\n \"acc_stderr\": 0.009924440374585243\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6095526914329037,\n \"acc_stderr\": 0.01343782986466858\n }\n}\n```", "repo_url": 
"https://huggingface.co/ZoidBB/Kory-0.1-11b-pre1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|arc:challenge|25_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|gsm8k|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|hellaswag|10_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T04-36-04.521362.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T04-36-04.521362.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-05T04-36-04.521362.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-05T04-36-04.521362.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T04-36-04.521362.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_05T04_36_04.521362", "path": ["**/details_harness|winogrande|5_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-05T04-36-04.521362.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_05T04_36_04.521362", "path": ["results_2024-02-05T04-36-04.521362.parquet"]}, {"split": "latest", "path": ["results_2024-02-05T04-36-04.521362.parquet"]}]}]} | 2024-02-05T04:38:41+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of ZoidBB/Kory-0.1-11b-pre1
Dataset automatically created during the evaluation run of model ZoidBB/Kory-0.1-11b-pre1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
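For example, a minimal sketch is shown below; the repository id is assumed to follow the leaderboard's usual `open-llm-leaderboard/details_<org>__<model>` naming, and `harness_winogrande_5` is just one of the available task configurations.

```python
from datasets import load_dataset

# Assumed repository id, following the leaderboard's details-repo naming convention.
data = load_dataset(
    "open-llm-leaderboard/details_ZoidBB__Kory-0.1-11b-pre1",
    "harness_winogrande_5",   # one of the 63 task configurations
    split="train",            # "train" points to the latest results
)
```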
## Latest results
These are the latest results from run 2024-02-05T04:36:04.521362 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of ZoidBB/Kory-0.1-11b-pre1\n\n\n\nDataset automatically created during the evaluation run of model ZoidBB/Kory-0.1-11b-pre1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-05T04:36:04.521362(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of ZoidBB/Kory-0.1-11b-pre1\n\n\n\nDataset automatically created during the evaluation run of model ZoidBB/Kory-0.1-11b-pre1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-05T04:36:04.521362(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
a8fa6810ef42a63a18cefa15f215d99629d02374 |
# Dataset Card for Evaluation run of argilla/CapybaraHermes-2.5-Mistral-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [argilla/CapybaraHermes-2.5-Mistral-7B](https://huggingface.co/argilla/CapybaraHermes-2.5-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_argilla__CapybaraHermes-2.5-Mistral-7B",
"harness_winogrande_5",
split="train")
```
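
The aggregated metrics mentioned above live in the "results" configuration; a minimal sketch of reading them is shown below (the exact column layout of the results table is an assumption and can vary between harness versions):

```python
from datasets import load_dataset

# The "results" configuration aggregates all task scores for a run;
# the "latest" split points to the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_argilla__CapybaraHermes-2.5-Mistral-7B",
    "results",
    split="latest",
)
print(results[0])  # inspect the aggregated metrics of the latest run
```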
## Latest results
These are the [latest results from run 2024-02-05T04:37:41.932157](https://huggingface.co/datasets/open-llm-leaderboard/details_argilla__CapybaraHermes-2.5-Mistral-7B/blob/main/results_2024-02-05T04-37-41.932157.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6337927509624036,
"acc_stderr": 0.03218425697687337,
"acc_norm": 0.6355188446806315,
"acc_norm_stderr": 0.032825707370373276,
"mc1": 0.40269277845777235,
"mc1_stderr": 0.01716883093518722,
"mc2": 0.5691194812882566,
"mc2_stderr": 0.015444382729777764
},
"harness|arc:challenge|25": {
"acc": 0.6313993174061433,
"acc_stderr": 0.014097810678042196,
"acc_norm": 0.6578498293515358,
"acc_norm_stderr": 0.013864152159177278
},
"harness|hellaswag|10": {
"acc": 0.6708822943636725,
"acc_stderr": 0.004689324696186879,
"acc_norm": 0.854511053574985,
"acc_norm_stderr": 0.0035187252573655988
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.048108401480826346,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.048108401480826346
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086923996,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086923996
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642507,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642507
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026705,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026705
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.02381447708659355,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.02381447708659355
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6128205128205129,
"acc_stderr": 0.024697216930878934,
"acc_norm": 0.6128205128205129,
"acc_norm_stderr": 0.024697216930878934
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028597,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028597
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886793,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8366972477064221,
"acc_stderr": 0.015848255806501534,
"acc_norm": 0.8366972477064221,
"acc_norm_stderr": 0.015848255806501534
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588667,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588667
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290902,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290902
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.0306365913486998,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.0306365913486998
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313729,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313729
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.03226219377286775,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.03226219377286775
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077805,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077805
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.013664230995834836,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.013664230995834836
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.024547617794803828,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.024547617794803828
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.32625698324022345,
"acc_stderr": 0.015680441518889178,
"acc_norm": 0.32625698324022345,
"acc_norm_stderr": 0.015680441518889178
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7418300653594772,
"acc_stderr": 0.02505850331695814,
"acc_norm": 0.7418300653594772,
"acc_norm_stderr": 0.02505850331695814
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.02592237178881877,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.02592237178881877
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.024383665531035457,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.024383665531035457
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.470013037809648,
"acc_stderr": 0.012747248967079077,
"acc_norm": 0.470013037809648,
"acc_norm_stderr": 0.012747248967079077
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462927,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462927
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162673,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162673
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7510204081632653,
"acc_stderr": 0.027682979522960227,
"acc_norm": 0.7510204081632653,
"acc_norm_stderr": 0.027682979522960227
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061456,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061456
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40269277845777235,
"mc1_stderr": 0.01716883093518722,
"mc2": 0.5691194812882566,
"mc2_stderr": 0.015444382729777764
},
"harness|winogrande|5": {
"acc": 0.7829518547750592,
"acc_stderr": 0.011585871710209411
},
"harness|gsm8k|5": {
"acc": 0.5928733889310084,
"acc_stderr": 0.013532811069356528
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_argilla__CapybaraHermes-2.5-Mistral-7B | [
"region:us"
] | 2024-02-05T04:40:03+00:00 | {"pretty_name": "Evaluation run of argilla/CapybaraHermes-2.5-Mistral-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [argilla/CapybaraHermes-2.5-Mistral-7B](https://huggingface.co/argilla/CapybaraHermes-2.5-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_argilla__CapybaraHermes-2.5-Mistral-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-05T04:37:41.932157](https://huggingface.co/datasets/open-llm-leaderboard/details_argilla__CapybaraHermes-2.5-Mistral-7B/blob/main/results_2024-02-05T04-37-41.932157.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6337927509624036,\n \"acc_stderr\": 0.03218425697687337,\n \"acc_norm\": 0.6355188446806315,\n \"acc_norm_stderr\": 0.032825707370373276,\n \"mc1\": 0.40269277845777235,\n \"mc1_stderr\": 0.01716883093518722,\n \"mc2\": 0.5691194812882566,\n \"mc2_stderr\": 0.015444382729777764\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6313993174061433,\n \"acc_stderr\": 0.014097810678042196,\n \"acc_norm\": 0.6578498293515358,\n \"acc_norm_stderr\": 0.013864152159177278\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6708822943636725,\n \"acc_stderr\": 0.004689324696186879,\n \"acc_norm\": 0.854511053574985,\n \"acc_norm_stderr\": 0.0035187252573655988\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03621034121889507\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.048108401480826346,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.048108401480826346\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.04166567577101579,\n \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.04166567577101579\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.025424835086923996,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086923996\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642507,\n \"acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642507\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026705,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026705\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.02381447708659355,\n \"acc_norm\": 0.8756476683937824,\n 
\"acc_norm_stderr\": 0.02381447708659355\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6128205128205129,\n \"acc_stderr\": 0.024697216930878934,\n \"acc_norm\": 0.6128205128205129,\n \"acc_norm_stderr\": 0.024697216930878934\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886793,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886793\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8366972477064221,\n \"acc_stderr\": 0.015848255806501534,\n \"acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.015848255806501534\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588667,\n \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588667\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290902,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290902\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n \"acc_stderr\": 0.0306365913486998,\n \"acc_norm\": 0.7040358744394619,\n \"acc_norm_stderr\": 0.0306365913486998\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313729,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313729\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.03226219377286775,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.03226219377286775\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n \"acc_stderr\": 0.022509033937077805,\n \"acc_norm\": 0.8632478632478633,\n \"acc_norm_stderr\": 0.022509033937077805\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n \"acc_stderr\": 0.013664230995834836,\n \"acc_norm\": 0.822477650063857,\n \"acc_norm_stderr\": 0.013664230995834836\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7052023121387283,\n \"acc_stderr\": 0.024547617794803828,\n \"acc_norm\": 0.7052023121387283,\n \"acc_norm_stderr\": 0.024547617794803828\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.32625698324022345,\n \"acc_stderr\": 0.015680441518889178,\n \"acc_norm\": 0.32625698324022345,\n \"acc_norm_stderr\": 0.015680441518889178\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.02505850331695814,\n \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.02505850331695814\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n \"acc_stderr\": 0.02592237178881877,\n \"acc_norm\": 0.7041800643086816,\n \"acc_norm_stderr\": 0.02592237178881877\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.024383665531035457,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.024383665531035457\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.470013037809648,\n \"acc_stderr\": 0.012747248967079077,\n \"acc_norm\": 0.470013037809648,\n \"acc_norm_stderr\": 0.012747248967079077\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462927,\n \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462927\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7510204081632653,\n \"acc_stderr\": 0.027682979522960227,\n \"acc_norm\": 0.7510204081632653,\n \"acc_norm_stderr\": 0.027682979522960227\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061456,\n \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061456\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40269277845777235,\n \"mc1_stderr\": 0.01716883093518722,\n \"mc2\": 0.5691194812882566,\n \"mc2_stderr\": 0.015444382729777764\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7829518547750592,\n \"acc_stderr\": 0.011585871710209411\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.5928733889310084,\n \"acc_stderr\": 0.013532811069356528\n }\n}\n```", "repo_url": "https://huggingface.co/argilla/CapybaraHermes-2.5-Mistral-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|arc:challenge|25_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|gsm8k|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hellaswag|10_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T04-37-41.932157.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-05T04-37-41.932157.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T04-37-41.932157.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-05T04-37-41.932157.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T04-37-41.932157.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["**/details_harness|winogrande|5_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-02-05T04-37-41.932157.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_05T04_37_41.932157", "path": ["results_2024-02-05T04-37-41.932157.parquet"]}, {"split": "latest", "path": ["results_2024-02-05T04-37-41.932157.parquet"]}]}]} | 2024-02-05T04:40:30+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of argilla/CapybaraHermes-2.5-Mistral-7B
Dataset automatically created during the evaluation run of model argilla/CapybaraHermes-2.5-Mistral-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
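For instance, with the `datasets` library (the `harness_winogrande_5` configuration is shown here; any of the other configurations can be substituted):

```python
from datasets import load_dataset

# Load one configuration of the evaluation details from the Hub
data = load_dataset(
    "open-llm-leaderboard/details_argilla__CapybaraHermes-2.5-Mistral-7B",
    "harness_winogrande_5",
    split="train",
)
```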
## Latest results
These are the latest results from run 2024-02-05T04:37:41.932157 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of argilla/CapybaraHermes-2.5-Mistral-7B\n\n\n\nDataset automatically created during the evaluation run of model argilla/CapybaraHermes-2.5-Mistral-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-05T04:37:41.932157(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of argilla/CapybaraHermes-2.5-Mistral-7B\n\n\n\nDataset automatically created during the evaluation run of model argilla/CapybaraHermes-2.5-Mistral-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-05T04:37:41.932157(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
34aa95f170786fd6bf059db7d77a472863db150b |
# Sadeem QnA: An Arabic QnA Dataset 🌍✨
Welcome to the **Sadeem QnA** dataset, a vibrant collection designed for the advancement of Arabic natural language processing, specifically tailored for Question Answering (QnA) systems. Sourced from the rich and diverse content of Arabic Wikipedia, this dataset is a gateway to exploring the depths of Arabic language understanding, offering a unique challenge to researchers and AI enthusiasts alike.
## Table of Contents
- [About Sadeem QnA](#about-sadeem-qna)
- [Dataset Structure](#dataset-structure)
- [Getting Started](#getting-started)
- [Usage](#usage)
- [Contributing](#contributing)
- [License](#license)
- [Citation](#citation)
## About Sadeem QnA
The **Sadeem QnA** dataset is crafted with the intent to foster research and development in Arabic Question Answering systems. It encompasses a broad range of topics, reflecting the rich tapestry of Arabic culture, history, and science, making it an ideal resource for training and evaluating AI models.
### Why Sadeem QnA?
- **Rich Content:** Over 6,000 QnA pairs across diverse subjects.
- **Real-World Questions:** Derived from actual queries people might ask, providing practical value for real-world applications.
- **Dual Splits:** Carefully partitioned into training (5,000 rows) and testing (1,030 rows) sets to facilitate effective model evaluation.
## Dataset Structure
Each record in the dataset follows a structured format, containing the following fields:
- `title`: The title of the Wikipedia article.
- `text`: A snippet from the article related to the question.
- `source`: The URL of the Wikipedia page.
- `question`: A question related to the text snippet.
- `answer`: The answer to the question.
- `has_answer`: A boolean indicating whether the answer is present in the text snippet.
### Example Record
```json
{
'title': 'قائمة الجوائز والترشيحات التي تلقتها سلسلة أفلام مباريات الجوع',
'text': 'قائمة الجوائز والترشيحات التي تلقتها سلسلة أفلام مباريات الجوع قائمة تُسجّل الترشيحات والجوائز التي تلقتها سلسلة أفلام مباريات الجوع المقتبسة من سلسلة مباريات الجوع للمؤلفة الأمريكية سوزان كولنز. والسلسلة من توزيع شركة ليونزغيت إنترتاينمنت، وقام ببطولتها جينيفر لورنس في دور كاتنيس إيفردين، جوش هوتشرسن في دور بيتا ميلاريك. وبدأت السلسلة بفيلم مباريات الجوع الذي صدر في العام 2012، ثم فيلم في العام 2013، وتبعهما كل من (2014) وأخيرًا: (2015). كان لجينيفر لورنس حصة الأسد في سجل الترشيحات والجوائز التي نالتها السلسلة.',
'source': 'https://ar.wikipedia.org/wiki?curid=6237097',
'question': 'متى صدر الفيلم الأول من سلسلة مباريات الجوع؟',
'answer': 'عام 2012',
'has_answer': True
},
{
'title': 'سانت فرنسيس (ويسكونسن)',
'text': 'بلغ عدد الأسر 4,494 أسرة كانت نسبة 19.8% منها لديها أطفال تحت سن الثامنة عشر تعيش معهم، وبلغت نسبة الأزواج القاطنين مع بعضهم البعض 36.6% من أصل المجموع الكلي للأسر، ونسبة 8.7% من الأسر كان لديها معيلات من الإناث دون وجود شريك، بينما كانت نسبة 3.9% من الأسر لديها معيلون من الذكور دون وجود شريكة وكانت نسبة 50.8% من غير العائلات. تألفت نسبة 42.6% من أصل جميع الأسر من أفراد ونسبة 13.7% كانوا يعيش معهم شخص وحيد يبلغ من العمر 65 عاماً فما فوق. وبلغ متوسط حجم الأسرة المعيشية 2.80، أما متوسط حجم العائلات فبلغ 2.02.',
'source': 'https://ar.wikipedia.org/wiki?curid=2198358',
'question': 'ما هو عدد العائلات المقيمة في سانت فرنسيس؟',
'answer': '',
'has_answer': False
}
```
## Getting Started
To get started with the **Sadeem QnA** dataset, you can download it directly from our [Huggingface repository](https://huggingface.co/datasets/sadeem-ai/arabic-qna).
Follow the instructions there to load the dataset into your environment and begin exploring.
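A minimal loading sketch with the `datasets` library, assuming the default configuration with the `train` and `test` splits described above:

```python
from datasets import load_dataset

# Load both splits of the Sadeem QnA dataset from the Hugging Face Hub
qna = load_dataset("sadeem-ai/arabic-qna")

train_set = qna["train"]  # 5,000 rows
test_set = qna["test"]    # 1,030 rows

# Each record exposes: title, text, source, question, answer, has_answer
print(train_set[0])
```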
## Usage
This dataset is perfect for:
- Training machine learning models for Arabic question answering.
- Evaluating the performance of NLP models on Arabic text (see the evaluation sketch after this list).
- Enhancing language understanding systems with a focus on Arabic.
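
As a concrete illustration of the evaluation use case, the sketch below scores exact match against the `answer` and `has_answer` fields. The `predict` function is a hypothetical placeholder (here a trivial always-unanswerable baseline) to be replaced with any Arabic QnA model:

```python
from datasets import load_dataset

test_set = load_dataset("sadeem-ai/arabic-qna", split="test")

def predict(question: str, context: str) -> str:
    # Hypothetical placeholder: swap in any Arabic QnA model here.
    # Returning "" predicts "unanswerable" for every question (trivial baseline).
    return ""

correct = 0
for row in test_set:
    pred = predict(row["question"], row["text"])
    # 'has_answer' may load as a bool or as the string "True"/"False" from the CSV.
    answerable = str(row["has_answer"]).lower() == "true"
    # Unanswerable questions count as correct only when the prediction is empty.
    gold = (row["answer"] or "") if answerable else ""
    correct += int(pred.strip() == gold.strip())

print(f"Exact match: {correct / len(test_set):.3f}")
```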
## Contributing
We welcome contributions from the community! Whether it's improving the documentation, adding more questions, or reporting issues, your help makes **Sadeem QnA** better for everyone.
## License
The **Sadeem QnA** dataset is available under the Apache License 2.0. We encourage its use for academic research, commercial applications, and beyond, provided proper attribution is given.
## Citation
If you use the **Sadeem QnA** dataset in your research, please cite it using the following format:
```bibtex
@misc{sadeem_qna,
title={Sadeem QnA: An Arabic QnA Dataset},
author={},
year={2024},
publisher={Huggingface},
howpublished={\url{https://huggingface.co/datasets/sadeem-ai/arabic-qna}},
}
```
Embark on your journey through the Arabic language with **Sadeem QnA** and unlock the potential of AI in understanding the complexity and beauty of Arabic text. 🚀💡
| sadeem-ai/arabic-qna | [
"task_categories:question-answering",
"size_categories:1K<n<10K",
"language:ar",
"license:apache-2.0",
"qna",
"questioning-answering",
"questions-generation",
"region:us"
] | 2024-02-05T04:45:47+00:00 | {"language": ["ar"], "license": "apache-2.0", "size_categories": ["1K<n<10K"], "task_categories": ["question-answering"], "pretty_name": "arabic QnA dataset", "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "ar-qna-train-data-hf.csv"}, {"split": "test", "path": "ar-qna-test-data-hf.csv"}]}], "tags": ["qna", "questioning-answering", "questions-generation"]} | 2024-02-05T13:38:36+00:00 | [] | [
"ar"
] | TAGS
#task_categories-question-answering #size_categories-1K<n<10K #language-Arabic #license-apache-2.0 #qna #questioning-answering #questions-generation #region-us
|
# Sadeem QnA: An Arabic QnA Dataset
Welcome to the Sadeem QnA dataset, a vibrant collection designed for the advancement of Arabic natural language processing, specifically tailored for Question Answering (QnA) systems. Sourced from the rich and diverse content of Arabic Wikipedia, this dataset is a gateway to exploring the depths of Arabic language understanding, offering a unique challenge to both researchers and AI enthusiasts alike.
## Table of Contents
- About Sadeem QnA
- Dataset Structure
- Getting Started
- Usage
- Contributing
- License
- Citation
## About Sadeem QnA
The Sadeem QnA dataset is crafted with the intent to foster research and development in Arabic Question Answering systems. It encompasses a broad range of topics, reflecting the rich tapestry of Arabic culture, history, and science, making it an ideal resource for training and evaluating AI models.
### Why Sadeem QnA?
- Rich Content: Over 6,000 QnA pairs across diverse subjects.
- Real-World Questions: Derived from actual queries people might ask, providing practical value for real-world applications.
- Dual Splits: Carefully partitioned into training (5,000 rows) and testing (1,030 rows) sets to facilitate effective model evaluation.
## Dataset Structure
Each record in the dataset follows a structured format, containing the following fields:
- 'title': The title of the Wikipedia article.
- 'text': A snippet from the article related to the question.
- 'source': The URL of the Wikipedia page.
- 'question': A question related to the text snippet.
- 'answer': The answer to the question.
- 'has_answer': A boolean indicating whether the answer is present in the text snippet.
### Example Record
## Getting Started
To get started with the Sadeem QnA dataset, you can download it directly from our Huggingface repository.
Follow the instructions there to load the dataset into your environment and begin exploring.
## Usage
This dataset is perfect for:
- Training machine learning models for Arabic question answering.
- Evaluating the performance of NLP models on Arabic text.
- Enhancing language understanding systems with a focus on Arabic.
## Contributing
We welcome contributions from the community! Whether it's improving the documentation, adding more questions, or reporting issues, your help makes Sadeem QnA better for everyone.
## License
The Sadeem QnA dataset is available under the Apache License 2.0. We encourage its use for academic research, commercial applications, and beyond, provided proper attribution is given.
If you use the Sadeem QnA dataset in your research, please cite it using the following format:
Embark on your journey through the Arabic language with Sadeem QnA and unlock the potential of AI in understanding the complexity and beauty of Arabic text.
| [
"# Sadeem QnA: An Arabic QnA Dataset \n\nWelcome to the Sadeem QnA dataset, a vibrant collection designed for the advancement of Arabic natural language processing, specifically tailored for Question Answering (QnA) systems. Sourced from the rich and diverse content of Arabic Wikipedia, this dataset is a gateway to exploring the depths of Arabic language understanding, offering a unique challenge to both researchers and AI enthusiasts alike.",
"## Table of Contents\n\n- About Sadeem QnA\n- Dataset Structure\n- Getting Started\n- Usage\n- Contributing\n- License\n- Citation",
"## About Sadeem QnA\n\nThe Sadeem QnA dataset is crafted with the intent to foster research and development in Arabic Question Answering systems. It encompasses a broad range of topics, reflecting the rich tapestry of Arabic culture, history, and science, making it an ideal resource for training and evaluating AI models.",
"### Why Sadeem QnA?\n\n- Rich Content: Over 6,000 QnA pairs across diverse subjects.\n- Real-World Questions: Derived from actual queries people might ask, providing practical value for real-world applications.\n- Dual Splits: Carefully partitioned into training (5,000 rows) and testing (1,030 rows) sets to facilitate effective model evaluation.",
"## Dataset Structure\n\nEach record in the dataset follows a structured format, containing the following fields:\n\n- 'title': The title of the Wikipedia article.\n- 'text': A snippet from the article related to the question.\n- 'source': The URL of the Wikipedia page.\n- 'question': A question related to the text snippet.\n- 'answer': The answer to the question.\n- 'has_answer': A boolean indicating whether the answer is present in the text snippet.",
"### Example Record",
"## Getting Started\n\nTo get started with the Sadeem QnA dataset, you can download it directly from our Huggingface repository.\nFollow the instructions there to load the dataset into your environment and begin exploring.",
"## Usage\n\nThis dataset is perfect for:\n- Training machine learning models for Arabic question answering.\n- Evaluating the performance of NLP models on Arabic text.\n- Enhancing language understanding systems with a focus on Arabic.",
"## Contributing\n\nWe welcome contributions from the community! Whether it's improving the documentation, adding more questions, or reporting issues, your help makes Sadeem QnA better for everyone.",
"## License\n\nThe Sadeem QnA dataset is available under the Apache License 2.0. We encourage its use for academic research, commercial applications, and beyond, provided proper attribution is given.\n\nIf you use the Sadeem QnA dataset in your research, please cite it using the following format:\n\n\n\nEmbark on your journey through the Arabic language with Sadeem QnA and unlock the potential of AI in understanding the complexity and beauty of Arabic text."
] | [
"TAGS\n#task_categories-question-answering #size_categories-1K<n<10K #language-Arabic #license-apache-2.0 #qna #questioning-answering #questions-generation #region-us \n",
"# Sadeem QnA: An Arabic QnA Dataset \n\nWelcome to the Sadeem QnA dataset, a vibrant collection designed for the advancement of Arabic natural language processing, specifically tailored for Question Answering (QnA) systems. Sourced from the rich and diverse content of Arabic Wikipedia, this dataset is a gateway to exploring the depths of Arabic language understanding, offering a unique challenge to both researchers and AI enthusiasts alike.",
"## Table of Contents\n\n- About Sadeem QnA\n- Dataset Structure\n- Getting Started\n- Usage\n- Contributing\n- License\n- Citation",
"## About Sadeem QnA\n\nThe Sadeem QnA dataset is crafted with the intent to foster research and development in Arabic Question Answering systems. It encompasses a broad range of topics, reflecting the rich tapestry of Arabic culture, history, and science, making it an ideal resource for training and evaluating AI models.",
"### Why Sadeem QnA?\n\n- Rich Content: Over 6,000 QnA pairs across diverse subjects.\n- Real-World Questions: Derived from actual queries people might ask, providing practical value for real-world applications.\n- Dual Splits: Carefully partitioned into training (5,000 rows) and testing (1,030 rows) sets to facilitate effective model evaluation.",
"## Dataset Structure\n\nEach record in the dataset follows a structured format, containing the following fields:\n\n- 'title': The title of the Wikipedia article.\n- 'text': A snippet from the article related to the question.\n- 'source': The URL of the Wikipedia page.\n- 'question': A question related to the text snippet.\n- 'answer': The answer to the question.\n- 'has_answer': A boolean indicating whether the answer is present in the text snippet.",
"### Example Record",
"## Getting Started\n\nTo get started with the Sadeem QnA dataset, you can download it directly from our Huggingface repository.\nFollow the instructions there to load the dataset into your environment and begin exploring.",
"## Usage\n\nThis dataset is perfect for:\n- Training machine learning models for Arabic question answering.\n- Evaluating the performance of NLP models on Arabic text.\n- Enhancing language understanding systems with a focus on Arabic.",
"## Contributing\n\nWe welcome contributions from the community! Whether it's improving the documentation, adding more questions, or reporting issues, your help makes Sadeem QnA better for everyone.",
"## License\n\nThe Sadeem QnA dataset is available under the Apache License 2.0. We encourage its use for academic research, commercial applications, and beyond, provided proper attribution is given.\n\nIf you use the Sadeem QnA dataset in your research, please cite it using the following format:\n\n\n\nEmbark on your journey through the Arabic language with Sadeem QnA and unlock the potential of AI in understanding the complexity and beauty of Arabic text."
] |
cfe7053280761c7e0135e3f9cd9c4cf33b378dd7 |
# Dataset Card for Evaluation run of abacusai/Fewshot-Metamath-OrcaVicuna-Mistral-10B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [abacusai/Fewshot-Metamath-OrcaVicuna-Mistral-10B](https://huggingface.co/abacusai/Fewshot-Metamath-OrcaVicuna-Mistral-10B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_abacusai__Fewshot-Metamath-OrcaVicuna-Mistral-10B",
"harness_winogrande_5",
split="train")
```
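
A similar sketch for the aggregated metrics, assuming the "results" configuration follows the same split-naming convention as the per-task configurations:

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_abacusai__Fewshot-Metamath-OrcaVicuna-Mistral-10B"

# List every per-task configuration plus the aggregated "results" configuration.
print(get_dataset_config_names(repo))

# Split naming is an assumption here: the most recent run is exposed as the
# "latest" split in the configuration list (the description above also refers
# to it as "train"), with one timestamped split per evaluation run.
results = load_dataset(repo, "results", split="latest")
print(results[0])
```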
## Latest results
These are the [latest results from run 2024-02-05T04:53:01.217298](https://huggingface.co/datasets/open-llm-leaderboard/details_abacusai__Fewshot-Metamath-OrcaVicuna-Mistral-10B/blob/main/results_2024-02-05T04-53-01.217298.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5886860732776018,
"acc_stderr": 0.03332678726623594,
"acc_norm": 0.5977872250403768,
"acc_norm_stderr": 0.03408055329985237,
"mc1": 0.32802937576499386,
"mc1_stderr": 0.01643563293281503,
"mc2": 0.5097747099068484,
"mc2_stderr": 0.014813899529913443
},
"harness|arc:challenge|25": {
"acc": 0.507679180887372,
"acc_stderr": 0.01460966744089257,
"acc_norm": 0.5639931740614335,
"acc_norm_stderr": 0.014491225699230916
},
"harness|hellaswag|10": {
"acc": 0.5804620593507269,
"acc_stderr": 0.00492474850063935,
"acc_norm": 0.7812188807010556,
"acc_norm_stderr": 0.0041257489882920205
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.04284958639753401,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.04284958639753401
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.631578947368421,
"acc_stderr": 0.03925523381052932,
"acc_norm": 0.631578947368421,
"acc_norm_stderr": 0.03925523381052932
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6490566037735849,
"acc_stderr": 0.02937364625323469,
"acc_norm": 0.6490566037735849,
"acc_norm_stderr": 0.02937364625323469
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.039420826399272135,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.039420826399272135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.047240073523838876,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.047240073523838876
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4851063829787234,
"acc_stderr": 0.032671518489247764,
"acc_norm": 0.4851063829787234,
"acc_norm_stderr": 0.032671518489247764
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.35964912280701755,
"acc_stderr": 0.04514496132873633,
"acc_norm": 0.35964912280701755,
"acc_norm_stderr": 0.04514496132873633
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.02510742548113729,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.02510742548113729
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7354838709677419,
"acc_stderr": 0.02509189237885928,
"acc_norm": 0.7354838709677419,
"acc_norm_stderr": 0.02509189237885928
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511657,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511657
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026704,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026704
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8341968911917098,
"acc_stderr": 0.026839845022314415,
"acc_norm": 0.8341968911917098,
"acc_norm_stderr": 0.026839845022314415
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6,
"acc_stderr": 0.02483881198803316,
"acc_norm": 0.6,
"acc_norm_stderr": 0.02483881198803316
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02831753349606648,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02831753349606648
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.03802039760107903,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.03802039760107903
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7853211009174312,
"acc_stderr": 0.017604304149256483,
"acc_norm": 0.7853211009174312,
"acc_norm_stderr": 0.017604304149256483
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.030964517926923403,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.030964517926923403
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.729957805907173,
"acc_stderr": 0.028900721906293433,
"acc_norm": 0.729957805907173,
"acc_norm_stderr": 0.028900721906293433
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.03210062154134987,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.03210062154134987
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.04026187527591207,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.04026187527591207
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6993865030674846,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.6993865030674846,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.046840993210771065,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.046840993210771065
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.044532548363264673,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.044532548363264673
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8205128205128205,
"acc_stderr": 0.02514093595033544,
"acc_norm": 0.8205128205128205,
"acc_norm_stderr": 0.02514093595033544
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7662835249042146,
"acc_stderr": 0.015133383278988836,
"acc_norm": 0.7662835249042146,
"acc_norm_stderr": 0.015133383278988836
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.661849710982659,
"acc_stderr": 0.02546977014940017,
"acc_norm": 0.661849710982659,
"acc_norm_stderr": 0.02546977014940017
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.35195530726256985,
"acc_stderr": 0.015972668523689074,
"acc_norm": 0.35195530726256985,
"acc_norm_stderr": 0.015972668523689074
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7026143790849673,
"acc_stderr": 0.02617390850671858,
"acc_norm": 0.7026143790849673,
"acc_norm_stderr": 0.02617390850671858
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6495176848874598,
"acc_stderr": 0.02709865262130175,
"acc_norm": 0.6495176848874598,
"acc_norm_stderr": 0.02709865262130175
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.026229649178821163,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.026229649178821163
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4326241134751773,
"acc_stderr": 0.02955545423677885,
"acc_norm": 0.4326241134751773,
"acc_norm_stderr": 0.02955545423677885
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4211212516297262,
"acc_stderr": 0.012610325733489905,
"acc_norm": 0.4211212516297262,
"acc_norm_stderr": 0.012610325733489905
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6286764705882353,
"acc_stderr": 0.02934980313976587,
"acc_norm": 0.6286764705882353,
"acc_norm_stderr": 0.02934980313976587
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5571895424836601,
"acc_stderr": 0.02009508315457735,
"acc_norm": 0.5571895424836601,
"acc_norm_stderr": 0.02009508315457735
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6090909090909091,
"acc_stderr": 0.04673752333670238,
"acc_norm": 0.6090909090909091,
"acc_norm_stderr": 0.04673752333670238
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.027979823538744543,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.027979823538744543
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7761194029850746,
"acc_stderr": 0.029475250236017204,
"acc_norm": 0.7761194029850746,
"acc_norm_stderr": 0.029475250236017204
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.031581495393387324,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.031581495393387324
},
"harness|truthfulqa:mc|0": {
"mc1": 0.32802937576499386,
"mc1_stderr": 0.01643563293281503,
"mc2": 0.5097747099068484,
"mc2_stderr": 0.014813899529913443
},
"harness|winogrande|5": {
"acc": 0.7647987371744278,
"acc_stderr": 0.01192000816365088
},
"harness|gsm8k|5": {
"acc": 0.1326762699014405,
"acc_stderr": 0.009343929131442216
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_abacusai__Fewshot-Metamath-OrcaVicuna-Mistral-10B | [
"region:us"
] | 2024-02-05T04:55:18+00:00 | {"pretty_name": "Evaluation run of abacusai/Fewshot-Metamath-OrcaVicuna-Mistral-10B", "dataset_summary": "Dataset automatically created during the evaluation run of model [abacusai/Fewshot-Metamath-OrcaVicuna-Mistral-10B](https://huggingface.co/abacusai/Fewshot-Metamath-OrcaVicuna-Mistral-10B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_abacusai__Fewshot-Metamath-OrcaVicuna-Mistral-10B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-05T04:53:01.217298](https://huggingface.co/datasets/open-llm-leaderboard/details_abacusai__Fewshot-Metamath-OrcaVicuna-Mistral-10B/blob/main/results_2024-02-05T04-53-01.217298.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5886860732776018,\n \"acc_stderr\": 0.03332678726623594,\n \"acc_norm\": 0.5977872250403768,\n \"acc_norm_stderr\": 0.03408055329985237,\n \"mc1\": 0.32802937576499386,\n \"mc1_stderr\": 0.01643563293281503,\n \"mc2\": 0.5097747099068484,\n \"mc2_stderr\": 0.014813899529913443\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.507679180887372,\n \"acc_stderr\": 0.01460966744089257,\n \"acc_norm\": 0.5639931740614335,\n \"acc_norm_stderr\": 0.014491225699230916\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5804620593507269,\n \"acc_stderr\": 0.00492474850063935,\n \"acc_norm\": 0.7812188807010556,\n \"acc_norm_stderr\": 0.0041257489882920205\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.03925523381052932,\n \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.03925523381052932\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6490566037735849,\n \"acc_stderr\": 0.02937364625323469,\n \"acc_norm\": 0.6490566037735849,\n \"acc_norm_stderr\": 0.02937364625323469\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.039420826399272135,\n \"acc_norm\": 0.6666666666666666,\n 
\"acc_norm_stderr\": 0.039420826399272135\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.047240073523838876,\n \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.047240073523838876\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4851063829787234,\n \"acc_stderr\": 0.032671518489247764,\n \"acc_norm\": 0.4851063829787234,\n \"acc_norm_stderr\": 0.032671518489247764\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.35964912280701755,\n \"acc_stderr\": 0.04514496132873633,\n \"acc_norm\": 0.35964912280701755,\n \"acc_norm_stderr\": 0.04514496132873633\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.02510742548113729,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.02510742548113729\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7354838709677419,\n \"acc_stderr\": 0.02509189237885928,\n \"acc_norm\": 0.7354838709677419,\n \"acc_norm_stderr\": 0.02509189237885928\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511657,\n \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511657\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026704,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026704\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8341968911917098,\n \"acc_stderr\": 0.026839845022314415,\n 
\"acc_norm\": 0.8341968911917098,\n \"acc_norm_stderr\": 0.026839845022314415\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.02483881198803316,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.02483881198803316\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606648,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606648\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.03802039760107903,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.03802039760107903\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7853211009174312,\n \"acc_stderr\": 0.017604304149256483,\n \"acc_norm\": 0.7853211009174312,\n \"acc_norm_stderr\": 0.017604304149256483\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.030964517926923403,\n \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.030964517926923403\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.729957805907173,\n \"acc_stderr\": 0.028900721906293433,\n \"acc_norm\": 0.729957805907173,\n \"acc_norm_stderr\": 0.028900721906293433\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n \"acc_stderr\": 0.03210062154134987,\n \"acc_norm\": 0.6457399103139013,\n \"acc_norm_stderr\": 0.03210062154134987\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7355371900826446,\n \"acc_stderr\": 0.04026187527591207,\n \"acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.04026187527591207\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6993865030674846,\n \"acc_stderr\": 0.03602511318806771,\n \"acc_norm\": 0.6993865030674846,\n \"acc_norm_stderr\": 0.03602511318806771\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n \"acc_stderr\": 0.046840993210771065,\n \"acc_norm\": 0.41964285714285715,\n \"acc_norm_stderr\": 0.046840993210771065\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.044532548363264673,\n \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.044532548363264673\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8205128205128205,\n \"acc_stderr\": 0.02514093595033544,\n \"acc_norm\": 0.8205128205128205,\n \"acc_norm_stderr\": 0.02514093595033544\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7662835249042146,\n \"acc_stderr\": 0.015133383278988836,\n \"acc_norm\": 0.7662835249042146,\n \"acc_norm_stderr\": 0.015133383278988836\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.661849710982659,\n \"acc_stderr\": 0.02546977014940017,\n \"acc_norm\": 0.661849710982659,\n \"acc_norm_stderr\": 0.02546977014940017\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.35195530726256985,\n \"acc_stderr\": 0.015972668523689074,\n \"acc_norm\": 0.35195530726256985,\n \"acc_norm_stderr\": 0.015972668523689074\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7026143790849673,\n \"acc_stderr\": 0.02617390850671858,\n \"acc_norm\": 0.7026143790849673,\n \"acc_norm_stderr\": 0.02617390850671858\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6495176848874598,\n \"acc_stderr\": 0.02709865262130175,\n \"acc_norm\": 0.6495176848874598,\n \"acc_norm_stderr\": 0.02709865262130175\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.026229649178821163,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.026229649178821163\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4326241134751773,\n \"acc_stderr\": 0.02955545423677885,\n \"acc_norm\": 0.4326241134751773,\n \"acc_norm_stderr\": 0.02955545423677885\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4211212516297262,\n \"acc_stderr\": 0.012610325733489905,\n \"acc_norm\": 0.4211212516297262,\n \"acc_norm_stderr\": 0.012610325733489905\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6286764705882353,\n \"acc_stderr\": 0.02934980313976587,\n \"acc_norm\": 0.6286764705882353,\n \"acc_norm_stderr\": 0.02934980313976587\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5571895424836601,\n \"acc_stderr\": 0.02009508315457735,\n \"acc_norm\": 0.5571895424836601,\n \"acc_norm_stderr\": 0.02009508315457735\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n \"acc_stderr\": 0.04673752333670238,\n \"acc_norm\": 0.6090909090909091,\n \"acc_norm_stderr\": 0.04673752333670238\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.027979823538744543,\n \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.027979823538744543\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7761194029850746,\n \"acc_stderr\": 0.029475250236017204,\n \"acc_norm\": 0.7761194029850746,\n \"acc_norm_stderr\": 0.029475250236017204\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.031581495393387324,\n \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.031581495393387324\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.32802937576499386,\n \"mc1_stderr\": 0.01643563293281503,\n \"mc2\": 0.5097747099068484,\n \"mc2_stderr\": 0.014813899529913443\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7647987371744278,\n \"acc_stderr\": 0.01192000816365088\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.1326762699014405,\n \"acc_stderr\": 0.009343929131442216\n }\n}\n```", "repo_url": "https://huggingface.co/abacusai/Fewshot-Metamath-OrcaVicuna-Mistral-10B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|arc:challenge|25_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|gsm8k|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hellaswag|10_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T04-53-01.217298.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-05T04-53-01.217298.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T04-53-01.217298.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-05T04-53-01.217298.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T04-53-01.217298.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["**/details_harness|winogrande|5_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-02-05T04-53-01.217298.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_05T04_53_01.217298", "path": ["results_2024-02-05T04-53-01.217298.parquet"]}, {"split": "latest", "path": ["results_2024-02-05T04-53-01.217298.parquet"]}]}]} | 2024-02-05T04:55:39+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of abacusai/Fewshot-Metamath-OrcaVicuna-Mistral-10B
Dataset automatically created during the evaluation run of model abacusai/Fewshot-Metamath-OrcaVicuna-Mistral-10B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
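A minimal sketch of that call is given below. The repository id is assumed from the leaderboard's usual `details_<org>__<model>` naming pattern for this model, and `harness_winogrande_5` is one of the configurations listed in this card's metadata; any other listed configuration can be substituted.

```python
from datasets import load_dataset

# Repository id assumed from the leaderboard's details_<org>__<model> naming pattern.
data = load_dataset(
    "open-llm-leaderboard/details_abacusai__Fewshot-Metamath-OrcaVicuna-Mistral-10B",
    "harness_winogrande_5",  # any configuration listed in this card's metadata works here
    split="latest",          # each run also has a timestamped split, e.g. "2024_02_05T04_53_01.217298"
)
```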
## Latest results
These are the latest results from run 2024-02-05T04:53:01.217298 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of abacusai/Fewshot-Metamath-OrcaVicuna-Mistral-10B\n\n\n\nDataset automatically created during the evaluation run of model abacusai/Fewshot-Metamath-OrcaVicuna-Mistral-10B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-05T04:53:01.217298(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of abacusai/Fewshot-Metamath-OrcaVicuna-Mistral-10B\n\n\n\nDataset automatically created during the evaluation run of model abacusai/Fewshot-Metamath-OrcaVicuna-Mistral-10B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-05T04:53:01.217298(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
c3cac1c24e518bce2879d34d2d470fd6b1772126 | # Dataset Card for "sn18-all-20240204"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | winglian/sn18-all-20240204 | [
"region:us"
] | 2024-02-05T05:20:24+00:00 | {"dataset_info": {"features": [{"name": "conversations", "list": [{"name": "from", "dtype": "string"}, {"name": "value", "dtype": "string"}]}, {"name": "run_id", "dtype": "string"}, {"name": "step", "dtype": "int64"}, {"name": "uid", "dtype": "int64"}, {"name": "id", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 16095190820, "num_examples": 8595545}], "download_size": 7091700915, "dataset_size": 16095190820}} | 2024-02-05T05:27:58+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "sn18-all-20240204"
More Information needed | [
"# Dataset Card for \"sn18-all-20240204\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"sn18-all-20240204\"\n\nMore Information needed"
] |
01c18a5b06119e50e7755f93a7f121a4f303157c |
# Prompt Task Classification
Classifies prompts into categories and maps each one to its most probable task.
## Current Supported Categories
```py
['fill_mask',
'conversation',
'midjourney_image_generation',
'math',
'science',
'toxic_harmful',
'logical_reasoning',
'sex',
'creative_writing']
```
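For a quick look at the records, the dataset can be loaded with the `datasets` library. The snippet below is only a sketch: the split name and the exact column names are not documented in this card, so it inspects the schema rather than assuming particular fields.

```python
from datasets import load_dataset

# Split name "train" is an assumption; adjust it if the dataset uses a different split.
ds = load_dataset("jeanvydes/llm-routing-text-classification", split="train")

print(ds.features)  # inspect the schema, since column names are not documented in the card
print(ds[0])        # peek at one prompt / category example
```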
## Categories Data Composition
 | jeanvydes/llm-routing-text-classification | [
"task_categories:text-classification",
"language:en",
"license:unlicense",
"prompt-classification",
"region:us"
] | 2024-02-05T05:23:04+00:00 | {"language": ["en"], "license": "unlicense", "task_categories": ["text-classification"], "pretty_name": "Prompt Task Classification", "tags": ["prompt-classification"]} | 2024-02-11T05:45:00+00:00 | [] | [
"en"
] | TAGS
#task_categories-text-classification #language-English #license-unlicense #prompt-classification #region-us
|
# Prompt Task Classification
Classifies prompts into categories and maps each one to its most probable task.
## Current Supported Categories
## Categories Data Composition
!Categories Composition | [
"# Prompt Task Clasification\n\nCategory prompt into categories and results into the most probably task",
"## Current Supported Categories",
"## Categories Data Composition\n!Categories Composition"
] | [
"TAGS\n#task_categories-text-classification #language-English #license-unlicense #prompt-classification #region-us \n",
"# Prompt Task Clasification\n\nCategory prompt into categories and results into the most probably task",
"## Current Supported Categories",
"## Categories Data Composition\n!Categories Composition"
] |
e17b0e5fa979c9d9d8bcb1232267d6e7083821a9 |
# Dataset Card for Evaluation run of xaviviro/OpenHermes-2.5-FLOR-6.3B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [xaviviro/OpenHermes-2.5-FLOR-6.3B](https://huggingface.co/xaviviro/OpenHermes-2.5-FLOR-6.3B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_xaviviro__OpenHermes-2.5-FLOR-6.3B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-05T06:45:07.884886](https://huggingface.co/datasets/open-llm-leaderboard/details_xaviviro__OpenHermes-2.5-FLOR-6.3B/blob/main/results_2024-02-05T06-45-07.884886.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2564256649570238,
"acc_stderr": 0.03083065464492649,
"acc_norm": 0.25821512637036736,
"acc_norm_stderr": 0.031658317147635937,
"mc1": 0.24112607099143207,
"mc1_stderr": 0.014974827279752329,
"mc2": 0.46121758095208715,
"mc2_stderr": 0.015532516852005473
},
"harness|arc:challenge|25": {
"acc": 0.28498293515358364,
"acc_stderr": 0.013191348179838793,
"acc_norm": 0.33447098976109213,
"acc_norm_stderr": 0.013787460322441377
},
"harness|hellaswag|10": {
"acc": 0.37223660625373434,
"acc_stderr": 0.004824130528590593,
"acc_norm": 0.545309699263095,
"acc_norm_stderr": 0.004969251445596338
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.03633384414073466,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.03633384414073466
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.19078947368421054,
"acc_stderr": 0.031975658210325,
"acc_norm": 0.19078947368421054,
"acc_norm_stderr": 0.031975658210325
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2188679245283019,
"acc_stderr": 0.025447863825108625,
"acc_norm": 0.2188679245283019,
"acc_norm_stderr": 0.025447863825108625
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.21,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.21,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2023121387283237,
"acc_stderr": 0.03063114553919882,
"acc_norm": 0.2023121387283237,
"acc_norm_stderr": 0.03063114553919882
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2936170212765957,
"acc_stderr": 0.029771642712491223,
"acc_norm": 0.2936170212765957,
"acc_norm_stderr": 0.029771642712491223
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.03999423879281336,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.03999423879281336
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.25517241379310346,
"acc_stderr": 0.03632984052707842,
"acc_norm": 0.25517241379310346,
"acc_norm_stderr": 0.03632984052707842
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24867724867724866,
"acc_stderr": 0.022261817692400175,
"acc_norm": 0.24867724867724866,
"acc_norm_stderr": 0.022261817692400175
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1984126984126984,
"acc_stderr": 0.035670166752768635,
"acc_norm": 0.1984126984126984,
"acc_norm_stderr": 0.035670166752768635
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1870967741935484,
"acc_stderr": 0.022185710092252252,
"acc_norm": 0.1870967741935484,
"acc_norm_stderr": 0.022185710092252252
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.13793103448275862,
"acc_stderr": 0.024261984301044582,
"acc_norm": 0.13793103448275862,
"acc_norm_stderr": 0.024261984301044582
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.22424242424242424,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.22424242424242424,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.18686868686868688,
"acc_stderr": 0.027772533334218977,
"acc_norm": 0.18686868686868688,
"acc_norm_stderr": 0.027772533334218977
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.23316062176165803,
"acc_stderr": 0.030516111371476008,
"acc_norm": 0.23316062176165803,
"acc_norm_stderr": 0.030516111371476008
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.21025641025641026,
"acc_stderr": 0.020660597485026952,
"acc_norm": 0.21025641025641026,
"acc_norm_stderr": 0.020660597485026952
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.02488211685765509,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.02488211685765509
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23109243697478993,
"acc_stderr": 0.027381406927868952,
"acc_norm": 0.23109243697478993,
"acc_norm_stderr": 0.027381406927868952
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1963302752293578,
"acc_stderr": 0.017030719339154368,
"acc_norm": 0.1963302752293578,
"acc_norm_stderr": 0.017030719339154368
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.20833333333333334,
"acc_stderr": 0.027696910713093936,
"acc_norm": 0.20833333333333334,
"acc_norm_stderr": 0.027696910713093936
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.033086111132364336,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.033086111132364336
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.030685820596610812,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.030685820596610812
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3183856502242152,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.3183856502242152,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.26717557251908397,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.26717557251908397,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.256198347107438,
"acc_stderr": 0.03984979653302871,
"acc_norm": 0.256198347107438,
"acc_norm_stderr": 0.03984979653302871
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.043300437496507437,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.043300437496507437
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.27607361963190186,
"acc_stderr": 0.0351238528370505,
"acc_norm": 0.27607361963190186,
"acc_norm_stderr": 0.0351238528370505
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.04432804055291519,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.04432804055291519
},
"harness|hendrycksTest-management|5": {
"acc": 0.18446601941747573,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.18446601941747573,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.33760683760683763,
"acc_stderr": 0.030980296992618558,
"acc_norm": 0.33760683760683763,
"acc_norm_stderr": 0.030980296992618558
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2567049808429119,
"acc_stderr": 0.015620480263064533,
"acc_norm": 0.2567049808429119,
"acc_norm_stderr": 0.015620480263064533
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.26011560693641617,
"acc_stderr": 0.023618678310069374,
"acc_norm": 0.26011560693641617,
"acc_norm_stderr": 0.023618678310069374
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24183006535947713,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.24183006535947713,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.19935691318327975,
"acc_stderr": 0.022691033780549656,
"acc_norm": 0.19935691318327975,
"acc_norm_stderr": 0.022691033780549656
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.0227797190887334,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.0227797190887334
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23049645390070922,
"acc_stderr": 0.025123739226872405,
"acc_norm": 0.23049645390070922,
"acc_norm_stderr": 0.025123739226872405
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.25488917861799215,
"acc_stderr": 0.01113050981266298,
"acc_norm": 0.25488917861799215,
"acc_norm_stderr": 0.01113050981266298
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.20955882352941177,
"acc_stderr": 0.02472311040767707,
"acc_norm": 0.20955882352941177,
"acc_norm_stderr": 0.02472311040767707
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.017630827375148383,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.017630827375148383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2818181818181818,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.2818181818181818,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24081632653061225,
"acc_stderr": 0.027372942201788163,
"acc_norm": 0.24081632653061225,
"acc_norm_stderr": 0.027372942201788163
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.030147775935409217,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.030147775935409217
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.32748538011695905,
"acc_stderr": 0.035993357714560276,
"acc_norm": 0.32748538011695905,
"acc_norm_stderr": 0.035993357714560276
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24112607099143207,
"mc1_stderr": 0.014974827279752329,
"mc2": 0.46121758095208715,
"mc2_stderr": 0.015532516852005473
},
"harness|winogrande|5": {
"acc": 0.6298342541436464,
"acc_stderr": 0.013570454689603911
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_xaviviro__OpenHermes-2.5-FLOR-6.3B | [
"region:us"
] | 2024-02-05T06:47:36+00:00 | {"pretty_name": "Evaluation run of xaviviro/OpenHermes-2.5-FLOR-6.3B", "dataset_summary": "Dataset automatically created during the evaluation run of model [xaviviro/OpenHermes-2.5-FLOR-6.3B](https://huggingface.co/xaviviro/OpenHermes-2.5-FLOR-6.3B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_xaviviro__OpenHermes-2.5-FLOR-6.3B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-05T06:45:07.884886](https://huggingface.co/datasets/open-llm-leaderboard/details_xaviviro__OpenHermes-2.5-FLOR-6.3B/blob/main/results_2024-02-05T06-45-07.884886.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2564256649570238,\n \"acc_stderr\": 0.03083065464492649,\n \"acc_norm\": 0.25821512637036736,\n \"acc_norm_stderr\": 0.031658317147635937,\n \"mc1\": 0.24112607099143207,\n \"mc1_stderr\": 0.014974827279752329,\n \"mc2\": 0.46121758095208715,\n \"mc2_stderr\": 0.015532516852005473\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.28498293515358364,\n \"acc_stderr\": 0.013191348179838793,\n \"acc_norm\": 0.33447098976109213,\n \"acc_norm_stderr\": 0.013787460322441377\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.37223660625373434,\n \"acc_stderr\": 0.004824130528590593,\n \"acc_norm\": 0.545309699263095,\n \"acc_norm_stderr\": 0.004969251445596338\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.22962962962962963,\n \"acc_stderr\": 0.03633384414073466,\n \"acc_norm\": 0.22962962962962963,\n \"acc_norm_stderr\": 0.03633384414073466\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.19078947368421054,\n \"acc_stderr\": 0.031975658210325,\n \"acc_norm\": 0.19078947368421054,\n \"acc_norm_stderr\": 0.031975658210325\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2188679245283019,\n \"acc_stderr\": 0.025447863825108625,\n \"acc_norm\": 0.2188679245283019,\n \"acc_norm_stderr\": 0.025447863825108625\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2638888888888889,\n \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.2638888888888889,\n \"acc_norm_stderr\": 0.03685651095897532\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2023121387283237,\n \"acc_stderr\": 0.03063114553919882,\n \"acc_norm\": 0.2023121387283237,\n \"acc_norm_stderr\": 0.03063114553919882\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.2936170212765957,\n \"acc_stderr\": 0.029771642712491223,\n \"acc_norm\": 0.2936170212765957,\n \"acc_norm_stderr\": 0.029771642712491223\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.03999423879281336,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.03999423879281336\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.25517241379310346,\n \"acc_stderr\": 0.03632984052707842,\n \"acc_norm\": 0.25517241379310346,\n \"acc_norm_stderr\": 0.03632984052707842\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.24867724867724866,\n \"acc_stderr\": 0.022261817692400175,\n \"acc_norm\": 0.24867724867724866,\n \"acc_norm_stderr\": 0.022261817692400175\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1984126984126984,\n \"acc_stderr\": 0.035670166752768635,\n \"acc_norm\": 0.1984126984126984,\n \"acc_norm_stderr\": 0.035670166752768635\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.1870967741935484,\n \"acc_stderr\": 0.022185710092252252,\n \"acc_norm\": 0.1870967741935484,\n \"acc_norm_stderr\": 0.022185710092252252\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.13793103448275862,\n \"acc_stderr\": 0.024261984301044582,\n \"acc_norm\": 0.13793103448275862,\n \"acc_norm_stderr\": 0.024261984301044582\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.22424242424242424,\n \"acc_stderr\": 0.032568666616811015,\n \"acc_norm\": 0.22424242424242424,\n \"acc_norm_stderr\": 0.032568666616811015\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.18686868686868688,\n \"acc_stderr\": 0.027772533334218977,\n \"acc_norm\": 0.18686868686868688,\n \"acc_norm_stderr\": 0.027772533334218977\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.23316062176165803,\n \"acc_stderr\": 0.030516111371476008,\n \"acc_norm\": 
0.23316062176165803,\n \"acc_norm_stderr\": 0.030516111371476008\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.21025641025641026,\n \"acc_stderr\": 0.020660597485026952,\n \"acc_norm\": 0.21025641025641026,\n \"acc_norm_stderr\": 0.020660597485026952\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2111111111111111,\n \"acc_stderr\": 0.02488211685765509,\n \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.02488211685765509\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.23109243697478993,\n \"acc_stderr\": 0.027381406927868952,\n \"acc_norm\": 0.23109243697478993,\n \"acc_norm_stderr\": 0.027381406927868952\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.1963302752293578,\n \"acc_stderr\": 0.017030719339154368,\n \"acc_norm\": 0.1963302752293578,\n \"acc_norm_stderr\": 0.017030719339154368\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.20833333333333334,\n \"acc_stderr\": 0.027696910713093936,\n \"acc_norm\": 0.20833333333333334,\n \"acc_norm_stderr\": 0.027696910713093936\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.033086111132364336,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.033086111132364336\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.030685820596610812,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.030685820596610812\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3183856502242152,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.3183856502242152,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.26717557251908397,\n \"acc_stderr\": 0.038808483010823944,\n \"acc_norm\": 0.26717557251908397,\n \"acc_norm_stderr\": 0.038808483010823944\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.256198347107438,\n \"acc_stderr\": 0.03984979653302871,\n \"acc_norm\": 0.256198347107438,\n \"acc_norm_stderr\": 0.03984979653302871\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.043300437496507437,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.043300437496507437\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.27607361963190186,\n \"acc_stderr\": 0.0351238528370505,\n \"acc_norm\": 0.27607361963190186,\n \"acc_norm_stderr\": 0.0351238528370505\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n \"acc_stderr\": 0.04432804055291519,\n \"acc_norm\": 0.32142857142857145,\n \"acc_norm_stderr\": 0.04432804055291519\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.18446601941747573,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.18446601941747573,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.33760683760683763,\n \"acc_stderr\": 0.030980296992618558,\n \"acc_norm\": 0.33760683760683763,\n \"acc_norm_stderr\": 0.030980296992618558\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 
0.04688261722621504\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2567049808429119,\n \"acc_stderr\": 0.015620480263064533,\n \"acc_norm\": 0.2567049808429119,\n \"acc_norm_stderr\": 0.015620480263064533\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.26011560693641617,\n \"acc_stderr\": 0.023618678310069374,\n \"acc_norm\": 0.26011560693641617,\n \"acc_norm_stderr\": 0.023618678310069374\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.24183006535947713,\n \"acc_stderr\": 0.024518195641879334,\n \"acc_norm\": 0.24183006535947713,\n \"acc_norm_stderr\": 0.024518195641879334\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.19935691318327975,\n \"acc_stderr\": 0.022691033780549656,\n \"acc_norm\": 0.19935691318327975,\n \"acc_norm_stderr\": 0.022691033780549656\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.21296296296296297,\n \"acc_stderr\": 0.0227797190887334,\n \"acc_norm\": 0.21296296296296297,\n \"acc_norm_stderr\": 0.0227797190887334\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23049645390070922,\n \"acc_stderr\": 0.025123739226872405,\n \"acc_norm\": 0.23049645390070922,\n \"acc_norm_stderr\": 0.025123739226872405\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.25488917861799215,\n \"acc_stderr\": 0.01113050981266298,\n \"acc_norm\": 0.25488917861799215,\n \"acc_norm_stderr\": 0.01113050981266298\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.20955882352941177,\n \"acc_stderr\": 0.02472311040767707,\n \"acc_norm\": 0.20955882352941177,\n \"acc_norm_stderr\": 0.02472311040767707\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.017630827375148383,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.017630827375148383\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2818181818181818,\n \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.2818181818181818,\n \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.24081632653061225,\n \"acc_stderr\": 0.027372942201788163,\n \"acc_norm\": 0.24081632653061225,\n \"acc_norm_stderr\": 0.027372942201788163\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n \"acc_stderr\": 0.030147775935409217,\n \"acc_norm\": 0.23880597014925373,\n \"acc_norm_stderr\": 0.030147775935409217\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.32748538011695905,\n \"acc_stderr\": 0.035993357714560276,\n \"acc_norm\": 0.32748538011695905,\n \"acc_norm_stderr\": 0.035993357714560276\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24112607099143207,\n \"mc1_stderr\": 0.014974827279752329,\n \"mc2\": 0.46121758095208715,\n \"mc2_stderr\": 0.015532516852005473\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6298342541436464,\n \"acc_stderr\": 
0.013570454689603911\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/xaviviro/OpenHermes-2.5-FLOR-6.3B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|arc:challenge|25_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|gsm8k|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hellaswag|10_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T06-45-07.884886.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-05T06-45-07.884886.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T06-45-07.884886.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-05T06-45-07.884886.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T06-45-07.884886.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["**/details_harness|winogrande|5_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-02-05T06-45-07.884886.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_05T06_45_07.884886", "path": ["results_2024-02-05T06-45-07.884886.parquet"]}, {"split": "latest", "path": ["results_2024-02-05T06-45-07.884886.parquet"]}]}]} | 2024-02-05T06:47:59+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of xaviviro/OpenHermes-2.5-FLOR-6.3B
Dataset automatically created during the evaluation run of model xaviviro/OpenHermes-2.5-FLOR-6.3B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
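A minimal sketch is shown below; the dataset repository name is an assumption based on the leaderboard's usual `details_<org>__<model>` naming convention for this model:

```python
from datasets import load_dataset

# Assumed repo name following the leaderboard convention for xaviviro/OpenHermes-2.5-FLOR-6.3B
data = load_dataset("open-llm-leaderboard/details_xaviviro__OpenHermes-2.5-FLOR-6.3B",
	"harness_winogrande_5",
	split="train")
```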
## Latest results
These are the latest results from run 2024-02-05T06:45:07.884886 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of xaviviro/OpenHermes-2.5-FLOR-6.3B\n\n\n\nDataset automatically created during the evaluation run of model xaviviro/OpenHermes-2.5-FLOR-6.3B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-05T06:45:07.884886(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of xaviviro/OpenHermes-2.5-FLOR-6.3B\n\n\n\nDataset automatically created during the evaluation run of model xaviviro/OpenHermes-2.5-FLOR-6.3B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-05T06:45:07.884886(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
2ef6b982a22f18abb38187939a60123af2c53fa7 |
# Dataset Card for Evaluation run of indischepartij/OpenMia-Indo-Mistral-7b-v3-refined
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [indischepartij/OpenMia-Indo-Mistral-7b-v3-refined](https://huggingface.co/indischepartij/OpenMia-Indo-Mistral-7b-v3-refined) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_indischepartij__OpenMia-Indo-Mistral-7b-v3-refined",
"harness_winogrande_5",
split="train")
```
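To get the aggregated scores rather than per-task details, the "results" configuration can be loaded the same way (a sketch based on the config list in this repo's metadata, where the "latest" split points at the most recent run):

```python
from datasets import load_dataset

# Aggregated results for the most recent evaluation run of this model
results = load_dataset("open-llm-leaderboard/details_indischepartij__OpenMia-Indo-Mistral-7b-v3-refined",
	"results",
	split="latest")
```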
## Latest results
These are the [latest results from run 2024-02-05T07:17:24.697764](https://huggingface.co/datasets/open-llm-leaderboard/details_indischepartij__OpenMia-Indo-Mistral-7b-v3-refined/blob/main/results_2024-02-05T07-17-24.697764.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6291464441848997,
"acc_stderr": 0.0325052901080867,
"acc_norm": 0.6303599490256852,
"acc_norm_stderr": 0.03317036353110983,
"mc1": 0.3769889840881273,
"mc1_stderr": 0.01696551757893035,
"mc2": 0.5394835979252358,
"mc2_stderr": 0.015320518165942699
},
"harness|arc:challenge|25": {
"acc": 0.6049488054607508,
"acc_stderr": 0.014285898292938167,
"acc_norm": 0.64419795221843,
"acc_norm_stderr": 0.01399057113791876
},
"harness|hellaswag|10": {
"acc": 0.640211113324039,
"acc_stderr": 0.004789575163418651,
"acc_norm": 0.842162915753834,
"acc_norm_stderr": 0.0036384306206139337
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5276595744680851,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.5276595744680851,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.02490699045899257,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.02490699045899257
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181012,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181012
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922986,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922986
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.025033870583015178,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.025033870583015178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6487179487179487,
"acc_stderr": 0.024203665177902803,
"acc_norm": 0.6487179487179487,
"acc_norm_stderr": 0.024203665177902803
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6512605042016807,
"acc_stderr": 0.03095663632856655,
"acc_norm": 0.6512605042016807,
"acc_norm_stderr": 0.03095663632856655
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8146788990825689,
"acc_stderr": 0.016659279700295827,
"acc_norm": 0.8146788990825689,
"acc_norm_stderr": 0.016659279700295827
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639325,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639325
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.0263616516683891,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.0263616516683891
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.030636591348699803,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.030636591348699803
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.816793893129771,
"acc_stderr": 0.03392770926494733,
"acc_norm": 0.816793893129771,
"acc_norm_stderr": 0.03392770926494733
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.040655781409087044,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.040655781409087044
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.0398913985953177,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.0398913985953177
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092375,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092375
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8160919540229885,
"acc_stderr": 0.013853724170922533,
"acc_norm": 0.8160919540229885,
"acc_norm_stderr": 0.013853724170922533
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.26256983240223464,
"acc_stderr": 0.014716824273017765,
"acc_norm": 0.26256983240223464,
"acc_norm_stderr": 0.014716824273017765
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7026143790849673,
"acc_stderr": 0.02617390850671858,
"acc_norm": 0.7026143790849673,
"acc_norm_stderr": 0.02617390850671858
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188933,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188933
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.025630824975621355,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.025630824975621355
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873862,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873862
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4654498044328553,
"acc_stderr": 0.012739711554045704,
"acc_norm": 0.4654498044328553,
"acc_norm_stderr": 0.012739711554045704
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6397058823529411,
"acc_stderr": 0.029163128570670733,
"acc_norm": 0.6397058823529411,
"acc_norm_stderr": 0.029163128570670733
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6339869281045751,
"acc_stderr": 0.019488025745529675,
"acc_norm": 0.6339869281045751,
"acc_norm_stderr": 0.019488025745529675
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.02879518557429129,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.02879518557429129
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233268,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233268
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.029913127232368043,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.029913127232368043
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3769889840881273,
"mc1_stderr": 0.01696551757893035,
"mc2": 0.5394835979252358,
"mc2_stderr": 0.015320518165942699
},
"harness|winogrande|5": {
"acc": 0.8153117600631413,
"acc_stderr": 0.01090597811215688
},
"harness|gsm8k|5": {
"acc": 0.6125852918877938,
"acc_stderr": 0.01341879844782737
}
}
```
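For a quick look outside the `datasets` library, the JSON above can also be parsed directly (a sketch; it assumes the snippet has been saved locally as `results.json`):

```python
import json

# Read the results JSON shown above (hypothetical local copy)
with open("results.json") as f:
    results = json.load(f)

# Aggregate accuracy across tasks, plus one individual task score
print(results["all"]["acc"])              # ~0.629
print(results["harness|gsm8k|5"]["acc"])  # ~0.613
```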
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_indischepartij__OpenMia-Indo-Mistral-7b-v3-refined | [
"region:us"
] | 2024-02-05T07:19:43+00:00 | {"pretty_name": "Evaluation run of indischepartij/OpenMia-Indo-Mistral-7b-v3-refined", "dataset_summary": "Dataset automatically created during the evaluation run of model [indischepartij/OpenMia-Indo-Mistral-7b-v3-refined](https://huggingface.co/indischepartij/OpenMia-Indo-Mistral-7b-v3-refined) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_indischepartij__OpenMia-Indo-Mistral-7b-v3-refined\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-05T07:17:24.697764](https://huggingface.co/datasets/open-llm-leaderboard/details_indischepartij__OpenMia-Indo-Mistral-7b-v3-refined/blob/main/results_2024-02-05T07-17-24.697764.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6291464441848997,\n \"acc_stderr\": 0.0325052901080867,\n \"acc_norm\": 0.6303599490256852,\n \"acc_norm_stderr\": 0.03317036353110983,\n \"mc1\": 0.3769889840881273,\n \"mc1_stderr\": 0.01696551757893035,\n \"mc2\": 0.5394835979252358,\n \"mc2_stderr\": 0.015320518165942699\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6049488054607508,\n \"acc_stderr\": 0.014285898292938167,\n \"acc_norm\": 0.64419795221843,\n \"acc_norm_stderr\": 0.01399057113791876\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.640211113324039,\n \"acc_stderr\": 0.004789575163418651,\n \"acc_norm\": 0.842162915753834,\n \"acc_norm_stderr\": 0.0036384306206139337\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 
0.7222222222222222,\n \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5276595744680851,\n \"acc_stderr\": 0.03263597118409769,\n \"acc_norm\": 0.5276595744680851,\n \"acc_norm_stderr\": 0.03263597118409769\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.373015873015873,\n \"acc_stderr\": 0.02490699045899257,\n \"acc_norm\": 0.373015873015873,\n \"acc_norm_stderr\": 0.02490699045899257\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n \"acc_stderr\": 0.023785577884181012,\n \"acc_norm\": 0.7741935483870968,\n \"acc_norm_stderr\": 0.023785577884181012\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.047258156262526094\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.02860620428922986,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922986\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8601036269430051,\n \"acc_stderr\": 
0.025033870583015178,\n \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015178\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6487179487179487,\n \"acc_stderr\": 0.024203665177902803,\n \"acc_norm\": 0.6487179487179487,\n \"acc_norm_stderr\": 0.024203665177902803\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6512605042016807,\n \"acc_stderr\": 0.03095663632856655,\n \"acc_norm\": 0.6512605042016807,\n \"acc_norm_stderr\": 0.03095663632856655\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8146788990825689,\n \"acc_stderr\": 0.016659279700295827,\n \"acc_norm\": 0.8146788990825689,\n \"acc_norm_stderr\": 0.016659279700295827\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.803921568627451,\n \"acc_stderr\": 0.027865942286639325,\n \"acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639325\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n \"acc_stderr\": 0.030636591348699803,\n \"acc_norm\": 0.7040358744394619,\n \"acc_norm_stderr\": 0.030636591348699803\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.816793893129771,\n \"acc_stderr\": 0.03392770926494733,\n \"acc_norm\": 0.816793893129771,\n \"acc_norm_stderr\": 0.03392770926494733\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.040655781409087044,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.040655781409087044\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.0398913985953177,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.0398913985953177\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.020588491316092375,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.020588491316092375\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n 
\"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8160919540229885,\n \"acc_stderr\": 0.013853724170922533,\n \"acc_norm\": 0.8160919540229885,\n \"acc_norm_stderr\": 0.013853724170922533\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.02440517393578323,\n \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.02440517393578323\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.26256983240223464,\n \"acc_stderr\": 0.014716824273017765,\n \"acc_norm\": 0.26256983240223464,\n \"acc_norm_stderr\": 0.014716824273017765\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7026143790849673,\n \"acc_stderr\": 0.02617390850671858,\n \"acc_norm\": 0.7026143790849673,\n \"acc_norm_stderr\": 0.02617390850671858\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.025670259242188933,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.025670259242188933\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.025630824975621355,\n \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.025630824975621355\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873862,\n \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873862\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4654498044328553,\n \"acc_stderr\": 0.012739711554045704,\n \"acc_norm\": 0.4654498044328553,\n \"acc_norm_stderr\": 0.012739711554045704\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6397058823529411,\n \"acc_stderr\": 0.029163128570670733,\n \"acc_norm\": 0.6397058823529411,\n \"acc_norm_stderr\": 0.029163128570670733\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6339869281045751,\n \"acc_stderr\": 0.019488025745529675,\n \"acc_norm\": 0.6339869281045751,\n \"acc_norm_stderr\": 0.019488025745529675\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.02879518557429129,\n \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.02879518557429129\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.026508590656233268,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.026508590656233268\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.029913127232368043,\n \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.029913127232368043\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3769889840881273,\n \"mc1_stderr\": 0.01696551757893035,\n \"mc2\": 0.5394835979252358,\n \"mc2_stderr\": 0.015320518165942699\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8153117600631413,\n \"acc_stderr\": 0.01090597811215688\n },\n 
\"harness|gsm8k|5\": {\n \"acc\": 0.6125852918877938,\n \"acc_stderr\": 0.01341879844782737\n }\n}\n```", "repo_url": "https://huggingface.co/indischepartij/OpenMia-Indo-Mistral-7b-v3-refined", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|arc:challenge|25_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|gsm8k|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hellaswag|10_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T07-17-24.697764.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-05T07-17-24.697764.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T07-17-24.697764.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-05T07-17-24.697764.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T07-17-24.697764.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["**/details_harness|winogrande|5_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-02-05T07-17-24.697764.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_05T07_17_24.697764", "path": ["results_2024-02-05T07-17-24.697764.parquet"]}, {"split": "latest", "path": ["results_2024-02-05T07-17-24.697764.parquet"]}]}]} | 2024-02-05T07:20:05+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of indischepartij/OpenMia-Indo-Mistral-7b-v3-refined
Dataset automatically created during the evaluation run of model indischepartij/OpenMia-Indo-Mistral-7b-v3-refined on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
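For instance (a minimal sketch following the leaderboard's usual "details_<org>__<model>" repository naming convention, assuming the `datasets` library is installed):

```python
from datasets import load_dataset

# Load one task configuration of the details repository for this model.
data = load_dataset(
    "open-llm-leaderboard/details_indischepartij__OpenMia-Indo-Mistral-7b-v3-refined",
    "harness_winogrande_5",
    split="train",
)
```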
## Latest results
These are the latest results from run 2024-02-05T07:17:24.697764 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of indischepartij/OpenMia-Indo-Mistral-7b-v3-refined\n\n\n\nDataset automatically created during the evaluation run of model indischepartij/OpenMia-Indo-Mistral-7b-v3-refined on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-05T07:17:24.697764(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of indischepartij/OpenMia-Indo-Mistral-7b-v3-refined\n\n\n\nDataset automatically created during the evaluation run of model indischepartij/OpenMia-Indo-Mistral-7b-v3-refined on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-05T07:17:24.697764(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
18235edc40100a2027481a633266576d44240943 |
# Dataset Card for Evaluation run of indischepartij/OpenMia-Indo-Engineering-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [indischepartij/OpenMia-Indo-Engineering-7b](https://huggingface.co/indischepartij/OpenMia-Indo-Engineering-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_indischepartij__OpenMia-Indo-Engineering-7b",
"harness_winogrande_5",
split="train")
```
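Each evaluated task has its own configuration, so it can also be convenient to enumerate the available configurations programmatically and to pull the aggregated scores directly. A minimal sketch (assuming the `datasets` library is installed; the exact configuration names are listed in this card's metadata):

```python
from datasets import get_dataset_config_names, load_dataset

REPO = "open-llm-leaderboard/details_indischepartij__OpenMia-Indo-Engineering-7b"

# One configuration per evaluated task, plus the aggregated "results" configuration.
configs = get_dataset_config_names(REPO)
print(len(configs), configs[:5])

# The "latest" split of the "results" configuration always points to the most
# recent evaluation run for this model.
results = load_dataset(REPO, "results", split="latest")
print(results[0])
```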
## Latest results
These are the [latest results from run 2024-02-05T07:22:15.500441](https://huggingface.co/datasets/open-llm-leaderboard/details_indischepartij__OpenMia-Indo-Engineering-7b/blob/main/results_2024-02-05T07-22-15.500441.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6324060235996815,
"acc_stderr": 0.03231686157638383,
"acc_norm": 0.6331079164519123,
"acc_norm_stderr": 0.032980184119772604,
"mc1": 0.4186046511627907,
"mc1_stderr": 0.01727001528447686,
"mc2": 0.5793947082847677,
"mc2_stderr": 0.01530573457723597
},
"harness|arc:challenge|25": {
"acc": 0.6245733788395904,
"acc_stderr": 0.014150631435111728,
"acc_norm": 0.6715017064846417,
"acc_norm_stderr": 0.0137249784655373
},
"harness|hellaswag|10": {
"acc": 0.6482772356104362,
"acc_stderr": 0.004765320784902126,
"acc_norm": 0.8501294562836088,
"acc_norm_stderr": 0.0035621498909627174
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5234042553191489,
"acc_stderr": 0.032650194750335815,
"acc_norm": 0.5234042553191489,
"acc_norm_stderr": 0.032650194750335815
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.02519710107424649,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.02519710107424649
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.02805779167298902,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.02805779167298902
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.02463978909770944,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.02463978909770944
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6358974358974359,
"acc_stderr": 0.02439667298509476,
"acc_norm": 0.6358974358974359,
"acc_norm_stderr": 0.02439667298509476
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.028406533090608456,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.028406533090608456
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6218487394957983,
"acc_stderr": 0.03149930577784906,
"acc_norm": 0.6218487394957983,
"acc_norm_stderr": 0.03149930577784906
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8256880733944955,
"acc_stderr": 0.016265675632010347,
"acc_norm": 0.8256880733944955,
"acc_norm_stderr": 0.016265675632010347
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.02732547096671632,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.02732547096671632
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601436,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601436
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7130044843049327,
"acc_stderr": 0.03036037971029195,
"acc_norm": 0.7130044843049327,
"acc_norm_stderr": 0.03036037971029195
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516303,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516303
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.039578354719809805,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.039578354719809805
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5357142857142857,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.5357142857142857,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8186462324393359,
"acc_stderr": 0.01377869377846408,
"acc_norm": 0.8186462324393359,
"acc_norm_stderr": 0.01377869377846408
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.024257901705323378,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.024257901705323378
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3027932960893855,
"acc_stderr": 0.015366860386397112,
"acc_norm": 0.3027932960893855,
"acc_norm_stderr": 0.015366860386397112
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.026090162504279056,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.026090162504279056
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818763,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818763
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7006172839506173,
"acc_stderr": 0.02548311560119545,
"acc_norm": 0.7006172839506173,
"acc_norm_stderr": 0.02548311560119545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4574468085106383,
"acc_stderr": 0.029719281272236848,
"acc_norm": 0.4574468085106383,
"acc_norm_stderr": 0.029719281272236848
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4726205997392438,
"acc_stderr": 0.012751075788015062,
"acc_norm": 0.4726205997392438,
"acc_norm_stderr": 0.012751075788015062
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6360294117647058,
"acc_stderr": 0.029227192460032025,
"acc_norm": 0.6360294117647058,
"acc_norm_stderr": 0.029227192460032025
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6290849673202614,
"acc_stderr": 0.019542101564854125,
"acc_norm": 0.6290849673202614,
"acc_norm_stderr": 0.019542101564854125
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4186046511627907,
"mc1_stderr": 0.01727001528447686,
"mc2": 0.5793947082847677,
"mc2_stderr": 0.01530573457723597
},
"harness|winogrande|5": {
"acc": 0.8232044198895028,
"acc_stderr": 0.010721923287918747
},
"harness|gsm8k|5": {
"acc": 0.6489764973464746,
"acc_stderr": 0.013146945941397222
}
}
```
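To work with these numbers programmatically, one option is to download the raw results file linked above straight from the dataset repository. A minimal sketch, assuming the `huggingface_hub` package is available (the exact nesting of the JSON in the raw file may differ slightly from the excerpt shown here):

```python
import json
from huggingface_hub import hf_hub_download

# Download the results file referenced in the "Latest results" link above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_indischepartij__OpenMia-Indo-Engineering-7b",
    filename="results_2024-02-05T07-22-15.500441.json",
    repo_type="dataset",
)

with open(path) as f:
    payload = json.load(f)

# The per-task scores may be nested under a "results" key in the raw file.
scores = payload.get("results", payload)
print(scores["all"])  # aggregate acc / acc_norm / mc1 / mc2 for this run
```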
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_indischepartij__OpenMia-Indo-Engineering-7b | [
"region:us"
] | 2024-02-05T07:24:38+00:00 | {"pretty_name": "Evaluation run of indischepartij/OpenMia-Indo-Engineering-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [indischepartij/OpenMia-Indo-Engineering-7b](https://huggingface.co/indischepartij/OpenMia-Indo-Engineering-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_indischepartij__OpenMia-Indo-Engineering-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-05T07:22:15.500441](https://huggingface.co/datasets/open-llm-leaderboard/details_indischepartij__OpenMia-Indo-Engineering-7b/blob/main/results_2024-02-05T07-22-15.500441.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6324060235996815,\n \"acc_stderr\": 0.03231686157638383,\n \"acc_norm\": 0.6331079164519123,\n \"acc_norm_stderr\": 0.032980184119772604,\n \"mc1\": 0.4186046511627907,\n \"mc1_stderr\": 0.01727001528447686,\n \"mc2\": 0.5793947082847677,\n \"mc2_stderr\": 0.01530573457723597\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6245733788395904,\n \"acc_stderr\": 0.014150631435111728,\n \"acc_norm\": 0.6715017064846417,\n \"acc_norm_stderr\": 0.0137249784655373\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6482772356104362,\n \"acc_stderr\": 0.004765320784902126,\n \"acc_norm\": 0.8501294562836088,\n \"acc_norm_stderr\": 0.0035621498909627174\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03621034121889507\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816508,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816508\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5234042553191489,\n \"acc_stderr\": 0.032650194750335815,\n \"acc_norm\": 0.5234042553191489,\n \"acc_norm_stderr\": 0.032650194750335815\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.02519710107424649,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.02519710107424649\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7677419354838709,\n \"acc_stderr\": 0.024022256130308235,\n \"acc_norm\": 0.7677419354838709,\n \"acc_norm_stderr\": 0.024022256130308235\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8080808080808081,\n \"acc_stderr\": 0.02805779167298902,\n \"acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.02805779167298902\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n \"acc_norm\": 0.8652849740932642,\n 
\"acc_norm_stderr\": 0.02463978909770944\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6358974358974359,\n \"acc_stderr\": 0.02439667298509476,\n \"acc_norm\": 0.6358974358974359,\n \"acc_norm_stderr\": 0.02439667298509476\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.31851851851851853,\n \"acc_stderr\": 0.028406533090608456,\n \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.028406533090608456\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6218487394957983,\n \"acc_stderr\": 0.03149930577784906,\n \"acc_norm\": 0.6218487394957983,\n \"acc_norm_stderr\": 0.03149930577784906\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8256880733944955,\n \"acc_stderr\": 0.016265675632010347,\n \"acc_norm\": 0.8256880733944955,\n \"acc_norm_stderr\": 0.016265675632010347\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49074074074074076,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8137254901960784,\n \"acc_stderr\": 0.02732547096671632,\n \"acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.02732547096671632\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601436,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601436\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7130044843049327,\n \"acc_stderr\": 0.03036037971029195,\n \"acc_norm\": 0.7130044843049327,\n \"acc_norm_stderr\": 0.03036037971029195\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516303,\n \"acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516303\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.039578354719809805,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.039578354719809805\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5357142857142857,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.5357142857142857,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8186462324393359,\n \"acc_stderr\": 0.01377869377846408,\n \"acc_norm\": 0.8186462324393359,\n \"acc_norm_stderr\": 0.01377869377846408\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.024257901705323378,\n \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.024257901705323378\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3027932960893855,\n \"acc_stderr\": 0.015366860386397112,\n \"acc_norm\": 0.3027932960893855,\n \"acc_norm_stderr\": 0.015366860386397112\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.026090162504279056,\n \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.026090162504279056\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7006172839506173,\n \"acc_stderr\": 0.02548311560119545,\n \"acc_norm\": 0.7006172839506173,\n \"acc_norm_stderr\": 0.02548311560119545\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4574468085106383,\n \"acc_stderr\": 0.029719281272236848,\n \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.029719281272236848\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4726205997392438,\n \"acc_stderr\": 0.012751075788015062,\n \"acc_norm\": 0.4726205997392438,\n \"acc_norm_stderr\": 0.012751075788015062\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6360294117647058,\n \"acc_stderr\": 0.029227192460032025,\n \"acc_norm\": 0.6360294117647058,\n \"acc_norm_stderr\": 0.029227192460032025\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6290849673202614,\n \"acc_stderr\": 0.019542101564854125,\n \"acc_norm\": 0.6290849673202614,\n \"acc_norm_stderr\": 0.019542101564854125\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4186046511627907,\n \"mc1_stderr\": 0.01727001528447686,\n \"mc2\": 0.5793947082847677,\n \"mc2_stderr\": 0.01530573457723597\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8232044198895028,\n \"acc_stderr\": 0.010721923287918747\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6489764973464746,\n 
\"acc_stderr\": 0.013146945941397222\n }\n}\n```", "repo_url": "https://huggingface.co/indischepartij/OpenMia-Indo-Engineering-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|arc:challenge|25_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|gsm8k|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|hellaswag|10_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T07-22-15.500441.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T07-22-15.500441.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-05T07-22-15.500441.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-05T07-22-15.500441.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T07-22-15.500441.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_05T07_22_15.500441", "path": ["**/details_harness|winogrande|5_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-05T07-22-15.500441.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_05T07_22_15.500441", "path": ["results_2024-02-05T07-22-15.500441.parquet"]}, {"split": "latest", "path": ["results_2024-02-05T07-22-15.500441.parquet"]}]}]} | 2024-02-05T07:25:00+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of indischepartij/OpenMia-Indo-Engineering-7b
Dataset automatically created during the evaluation run of model indischepartij/OpenMia-Indo-Engineering-7b on the Open LLM Leaderboard.
The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
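A minimal sketch, reconstructed from the dataset summary embedded in this entry's metadata above (the repository id and the `harness_winogrande_5` configuration name are both taken from that summary):

```python
# Load one per-task configuration of this details dataset with the `datasets` library.
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_indischepartij__OpenMia-Indo-Engineering-7b",
    "harness_winogrande_5",  # any configuration listed in this card works here
    split="train",           # the "train" split always points to the latest results
)
print(data)
```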
## Latest results
These are the latest results from run 2024-02-05T07:22:15.500441 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval).
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of indischepartij/OpenMia-Indo-Engineering-7b\n\n\n\nDataset automatically created during the evaluation run of model indischepartij/OpenMia-Indo-Engineering-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-05T07:22:15.500441(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of indischepartij/OpenMia-Indo-Engineering-7b\n\n\n\nDataset automatically created during the evaluation run of model indischepartij/OpenMia-Indo-Engineering-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-05T07:22:15.500441(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
9523dde46f0538f8fc532a5c1e2478ffb96330a6 |
# Dataset Card for UltraChat 200k
This is just the [original](https://huggingface.co/datasets/HuggingFaceH4/ultrachat_200k) ultrachat 200k dataset converted to sharegpt format.
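The conversion script itself is not included in this card; a rough sketch of what such a role/content → from/value mapping typically looks like is shown below. The `human`/`gpt` role names follow the common ShareGPT convention and are an assumption here, not something stated by this repository; the `conversations` field with `from`/`value` turns does match this repository's declared features.

```python
# Hypothetical sketch: convert the original "messages" (role/content) column
# into a ShareGPT-style "conversations" (from/value) column.
from datasets import load_dataset

ROLE_MAP = {"user": "human", "assistant": "gpt", "system": "system"}  # assumed ShareGPT role names

def to_sharegpt(example):
    example["conversations"] = [
        {"from": ROLE_MAP.get(m["role"], m["role"]), "value": m["content"]}
        for m in example["messages"]
    ]
    return example

ds = load_dataset("HuggingFaceH4/ultrachat_200k", split="train_sft")
ds = ds.map(to_sharegpt, remove_columns=["messages"])
```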
## Dataset Description
This is a heavily filtered version of the [UltraChat](https://github.com/thunlp/UltraChat) dataset and was used to train [Zephyr-7B-β](https://huggingface.co/HuggingFaceH4/zephyr-7b-beta), a state-of-the-art 7B chat model.
The original dataset consists of 1.4M dialogues generated by ChatGPT and spanning a wide range of topics. To create `UltraChat 200k`, we applied the following logic:
- Selection of a subset of data for faster supervised fine tuning.
- Truecasing of the dataset, as we observed around 5% of the data contained grammatical errors like "Hello. how are you?" instead of "Hello. How are you?"
- Removal of dialogues where the assistant replies with phrases like "I do not have emotions" or "I don't have opinions", even for fact-based prompts that don't involve either (a rough sketch of this filter follows below).
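A minimal, illustrative sketch of that last filtering step. The exact phrase list and matching rule used by the authors are not given in this card, so both are assumptions:

```python
# Hypothetical filter: drop a dialogue if any assistant turn contains a canned
# "no emotions / no opinions" disclaimer. The phrase list and lower-cased substring
# matching are assumptions, not the authors' actual implementation.
BANNED_PHRASES = ("i do not have emotions", "i don't have opinions")

def keep_dialogue(example):
    for message in example["messages"]:
        if message["role"] == "assistant":
            text = message["content"].lower()
            if any(phrase in text for phrase in BANNED_PHRASES):
                return False
    return True

# Applied to a loaded split, e.g.: filtered = ds.filter(keep_dialogue)
```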
## Dataset Structure
The dataset has four splits, suitable for:
* Supervised fine-tuning (`sft`).
* Generation ranking (`gen`) via techniques like rejection sampling or PPO.
The number of examples per split is shown as follows:
| train_sft | test_sft | train_gen | test_gen |
|:-------:|:-----------:|:-----:| :-----:|
| 207865 | 23110 | 256032 | 28304 |
The dataset is stored in parquet format with each entry using the following schema:
```
{
"prompt": "Create a fully-developed protagonist who is challenged to survive within a dystopian society under the rule of a tyrant. ...",
"messages":[
{
"content": "Create a fully-developed protagonist who is challenged to survive within a dystopian society under the rule of a tyrant. ...",
"role": "user"
},
{
"content": "Name: Ava\n\n Ava was just 16 years old when the world as she knew it came crashing down. The government had collapsed, leaving behind a chaotic and lawless society. ...",
"role": "assistant"
},
{
"content": "Wow, Ava's story is so intense and inspiring! Can you provide me with more details. ...",
"role": "user"
},
{
"content": "Certainly! ....",
"role": "assistant"
},
{
"content": "That's really interesting! I would love to hear more...",
"role": "user"
},
{
"content": "Certainly! ....",
"role": "assistant"
},
],
"prompt_id": "d938b65dfe31f05f80eb8572964c6673eddbd68eff3db6bd234d7f1e3b86c2af"
}
```
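Note that the prose above is inherited from the original card and refers to `sft`/`gen` splits and a `messages` column; this converted repository's own metadata instead lists plain `train`/`test` splits with `prompt`, `prompt_id`, and a `conversations` list of `from`/`value` turns. A minimal loading sketch under that assumption:

```python
# Load the ShareGPT-formatted copy and inspect the first dialogue.
from datasets import load_dataset

ds = load_dataset("abhinand/ultrachat_200k_sharegpt", split="train")
example = ds[0]
print(example["prompt_id"])
for turn in example["conversations"]:
    print(f'{turn["from"]}: {turn["value"][:80]}')
```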
## Citation
If you find this dataset useful in your work, please cite the original UltraChat dataset:
```
@misc{ding2023enhancing,
title={Enhancing Chat Language Models by Scaling High-quality Instructional Conversations},
author={Ning Ding and Yulin Chen and Bokai Xu and Yujia Qin and Zhi Zheng and Shengding Hu and Zhiyuan Liu and Maosong Sun and Bowen Zhou},
year={2023},
eprint={2305.14233},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
You may also wish to cite the Zephyr 7B technical report:
```
@misc{tunstall2023zephyr,
title={Zephyr: Direct Distillation of LM Alignment},
author={Lewis Tunstall and Edward Beeching and Nathan Lambert and Nazneen Rajani and Kashif Rasul and Younes Belkada and Shengyi Huang and Leandro von Werra and Clémentine Fourrier and Nathan Habib and Nathan Sarrazin and Omar Sanseviero and Alexander M. Rush and Thomas Wolf},
year={2023},
eprint={2310.16944},
archivePrefix={arXiv},
primaryClass={cs.LG}
}
```
| abhinand/ultrachat_200k_sharegpt | [
"language:en",
"arxiv:2305.14233",
"arxiv:2310.16944",
"region:us"
] | 2024-02-05T07:32:24+00:00 | {"language": ["en"], "dataset_info": {"features": [{"name": "prompt", "dtype": "string"}, {"name": "prompt_id", "dtype": "string"}, {"name": "conversations", "list": [{"name": "from", "dtype": "string"}, {"name": "value", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 1393769584, "num_examples": 207865}, {"name": "test", "num_bytes": 154329899, "num_examples": 23110}], "download_size": 813245371, "dataset_size": 1548099483}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-02-09T13:53:16+00:00 | [
"2305.14233",
"2310.16944"
] | [
"en"
] | TAGS
#language-English #arxiv-2305.14233 #arxiv-2310.16944 #region-us
| Dataset Card for UltraChat 200k
===============================
This is just the original UltraChat 200k dataset converted to ShareGPT format.
Dataset Description
-------------------
This is a heavily filtered version of the UltraChat dataset and was used to train Zephyr-7B-β, a state-of-the-art 7B chat model.
The original dataset consists of 1.4M dialogues generated by ChatGPT, spanning a wide range of topics. To create 'UltraChat 200k', we applied the following logic:
* Selection of a subset of data for faster supervised fine tuning.
* Truecasing of the dataset, as we observed around 5% of the data contained grammatical errors like "Hello. how are you?" instead of "Hello. How are you?"
* Removal of dialogues where the assistant replies with phrases like "I do not have emotions" or "I don't have opinions", even for fact-based prompts that don't involve either.
Dataset Structure
-----------------
The dataset has four splits, suitable for:
* Supervised fine-tuning ('sft').
* Generation ranking ('gen') via techniques like rejection sampling or PPO.
The number of examples per split is shown as follows:
The dataset is stored in parquet format with each entry using the following schema:
If you find this dataset useful in your work, please cite the original UltraChat dataset:
You may also wish to cite the Zephyr 7B technical report:
| [] | [
"TAGS\n#language-English #arxiv-2305.14233 #arxiv-2310.16944 #region-us \n"
] |
6cea60897b15ff41a1064daa21e2d9eae7d4ec3f | The Persian portion of the original CommonVoice 13 dataset at https://huggingface.co/datasets/mozilla-foundation/common_voice_13_0
#### Load
```python
# Using HF Datasets
from datasets import load_dataset
dataset = load_dataset("hezarai/common-voice-13-fa", split="train")
# Using Hezar
from hezar.data import Dataset
dataset = Dataset.load("hezarai/common-voice-13-fa", split="train")
``` | hezarai/common-voice-13-fa | [
"task_categories:automatic-speech-recognition",
"size_categories:10K<n<100K",
"language:fa",
"hezar",
"region:us"
] | 2024-02-05T08:02:32+00:00 | {"language": ["fa"], "size_categories": ["10K<n<100K"], "task_categories": ["automatic-speech-recognition"], "pretty_name": "CommonVoice 13 (Persian)", "tags": ["hezar"]} | 2024-02-08T10:44:30+00:00 | [] | [
"fa"
] | TAGS
#task_categories-automatic-speech-recognition #size_categories-10K<n<100K #language-Persian #hezar #region-us
| The Persian portion of the original CommonVoice 13 dataset at URL
#### Load
| [
"#### Load"
] | [
"TAGS\n#task_categories-automatic-speech-recognition #size_categories-10K<n<100K #language-Persian #hezar #region-us \n",
"#### Load"
] |
2189fa059682d79faa07f37a1c628be49873d33b |
# Dataset Card for Evaluation run of ZoidBB/MultiKory-0.1-4x11b-pre1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ZoidBB/MultiKory-0.1-4x11b-pre1](https://huggingface.co/ZoidBB/MultiKory-0.1-4x11b-pre1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ZoidBB__MultiKory-0.1-4x11b-pre1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-05T08:07:31.524035](https://huggingface.co/datasets/open-llm-leaderboard/details_ZoidBB__MultiKory-0.1-4x11b-pre1/blob/main/results_2024-02-05T08-07-31.524035.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6506701138513552,
"acc_stderr": 0.032251654043750994,
"acc_norm": 0.6513642747647624,
"acc_norm_stderr": 0.032921546302611786,
"mc1": 0.5214198286413708,
"mc1_stderr": 0.01748743214471164,
"mc2": 0.6767314029001894,
"mc2_stderr": 0.01524427540483159
},
"harness|arc:challenge|25": {
"acc": 0.7090443686006825,
"acc_stderr": 0.013273077865907588,
"acc_norm": 0.7286689419795221,
"acc_norm_stderr": 0.012993807727545803
},
"harness|hellaswag|10": {
"acc": 0.6954789882493527,
"acc_stderr": 0.004592637369905791,
"acc_norm": 0.879008165704043,
"acc_norm_stderr": 0.0032545129328064
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249387,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249387
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.036146654241808254,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.036146654241808254
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4312169312169312,
"acc_stderr": 0.025506481698138208,
"acc_norm": 0.4312169312169312,
"acc_norm_stderr": 0.025506481698138208
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.031922715695483,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.031922715695483
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229872,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229872
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.02150024957603346,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.02150024957603346
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563973,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563973
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37777777777777777,
"acc_stderr": 0.02956070739246572,
"acc_norm": 0.37777777777777777,
"acc_norm_stderr": 0.02956070739246572
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886786,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886786
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.015630022970092434,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.015630022970092434
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.03408655867977748,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.03408655867977748
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.026460569561240644,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.026460569561240644
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.026361651668389094,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.026361651668389094
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8199233716475096,
"acc_stderr": 0.013740797258579825,
"acc_norm": 0.8199233716475096,
"acc_norm_stderr": 0.013740797258579825
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.024027745155265023,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.024027745155265023
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.44692737430167595,
"acc_stderr": 0.016628030039647614,
"acc_norm": 0.44692737430167595,
"acc_norm_stderr": 0.016628030039647614
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.02465968518596728,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.02465968518596728
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4634941329856584,
"acc_stderr": 0.012736153390214961,
"acc_norm": 0.4634941329856584,
"acc_norm_stderr": 0.012736153390214961
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6650326797385621,
"acc_stderr": 0.01909422816700033,
"acc_norm": 0.6650326797385621,
"acc_norm_stderr": 0.01909422816700033
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.029043088683304324,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.029043088683304324
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5214198286413708,
"mc1_stderr": 0.01748743214471164,
"mc2": 0.6767314029001894,
"mc2_stderr": 0.01524427540483159
},
"harness|winogrande|5": {
"acc": 0.8539857932123125,
"acc_stderr": 0.009924440374585243
},
"harness|gsm8k|5": {
"acc": 0.6095526914329037,
"acc_stderr": 0.01343782986466858
}
}
```
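The per-task details above are also aggregated into the `results` configuration listed in this repository. A minimal sketch for reading the most recent aggregated metrics is shown below; the use of a `latest` split for the `results` configuration is an assumption based on the split naming used by the per-task configurations.

```python
from datasets import load_dataset

# The "results" configuration aggregates every evaluation run for this model;
# the "latest" split is assumed to point at the most recent one.
results = load_dataset(
    "open-llm-leaderboard/details_ZoidBB__MultiKory-0.1-4x11b-pre1",
    "results",
    split="latest",
)
print(results[0])
```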
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_ZoidBB__MultiKory-0.1-4x11b-pre1 | [
"region:us"
] | 2024-02-05T08:09:52+00:00 | {"pretty_name": "Evaluation run of ZoidBB/MultiKory-0.1-4x11b-pre1", "dataset_summary": "Dataset automatically created during the evaluation run of model [ZoidBB/MultiKory-0.1-4x11b-pre1](https://huggingface.co/ZoidBB/MultiKory-0.1-4x11b-pre1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ZoidBB__MultiKory-0.1-4x11b-pre1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-05T08:07:31.524035](https://huggingface.co/datasets/open-llm-leaderboard/details_ZoidBB__MultiKory-0.1-4x11b-pre1/blob/main/results_2024-02-05T08-07-31.524035.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6506701138513552,\n \"acc_stderr\": 0.032251654043750994,\n \"acc_norm\": 0.6513642747647624,\n \"acc_norm_stderr\": 0.032921546302611786,\n \"mc1\": 0.5214198286413708,\n \"mc1_stderr\": 0.01748743214471164,\n \"mc2\": 0.6767314029001894,\n \"mc2_stderr\": 0.01524427540483159\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7090443686006825,\n \"acc_stderr\": 0.013273077865907588,\n \"acc_norm\": 0.7286689419795221,\n \"acc_norm_stderr\": 0.012993807727545803\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6954789882493527,\n \"acc_stderr\": 0.004592637369905791,\n \"acc_norm\": 0.879008165704043,\n \"acc_norm_stderr\": 0.0032545129328064\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249387,\n \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249387\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4312169312169312,\n \"acc_stderr\": 0.025506481698138208,\n \"acc_norm\": 0.4312169312169312,\n \"acc_norm_stderr\": 0.025506481698138208\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695483,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695483\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.028606204289229872,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229872\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603346,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 
0.02150024957603346\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563973,\n \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563973\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.37777777777777777,\n \"acc_stderr\": 0.02956070739246572,\n \"acc_norm\": 0.37777777777777777,\n \"acc_norm_stderr\": 0.02956070739246572\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886786,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886786\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8422018348623853,\n \"acc_stderr\": 0.015630022970092434,\n \"acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.015630022970092434\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4861111111111111,\n \"acc_stderr\": 0.03408655867977748,\n \"acc_norm\": 0.4861111111111111,\n \"acc_norm_stderr\": 0.03408655867977748\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240644,\n \"acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240644\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.026361651668389094,\n \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.026361651668389094\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8199233716475096,\n \"acc_stderr\": 0.013740797258579825,\n \"acc_norm\": 0.8199233716475096,\n \"acc_norm_stderr\": 0.013740797258579825\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.024027745155265023,\n \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.024027745155265023\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.44692737430167595,\n \"acc_stderr\": 0.016628030039647614,\n \"acc_norm\": 0.44692737430167595,\n \"acc_norm_stderr\": 0.016628030039647614\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.02465968518596728,\n \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.02465968518596728\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4634941329856584,\n \"acc_stderr\": 0.012736153390214961,\n \"acc_norm\": 0.4634941329856584,\n \"acc_norm_stderr\": 0.012736153390214961\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6650326797385621,\n \"acc_stderr\": 0.01909422816700033,\n \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.01909422816700033\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.029043088683304324,\n \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.029043088683304324\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5214198286413708,\n \"mc1_stderr\": 0.01748743214471164,\n \"mc2\": 0.6767314029001894,\n \"mc2_stderr\": 0.01524427540483159\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8539857932123125,\n \"acc_stderr\": 0.009924440374585243\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6095526914329037,\n 
\"acc_stderr\": 0.01343782986466858\n }\n}\n```", "repo_url": "https://huggingface.co/ZoidBB/MultiKory-0.1-4x11b-pre1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|arc:challenge|25_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|gsm8k|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|hellaswag|10_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T08-07-31.524035.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T08-07-31.524035.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-05T08-07-31.524035.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-05T08-07-31.524035.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T08-07-31.524035.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_05T08_07_31.524035", "path": ["**/details_harness|winogrande|5_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-05T08-07-31.524035.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_05T08_07_31.524035", "path": ["results_2024-02-05T08-07-31.524035.parquet"]}, {"split": "latest", "path": ["results_2024-02-05T08-07-31.524035.parquet"]}]}]} | 2024-02-05T08:10:16+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of ZoidBB/MultiKory-0.1-4x11b-pre1
Dataset automatically created during the evaluation run of model ZoidBB/MultiKory-0.1-4x11b-pre1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
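A minimal loading sketch is given below; the repository id and the `harness_winogrande_5` configuration name are assumed from the leaderboard's usual `details_<org>__<model>` naming convention and may need adjusting.

```python
from datasets import load_dataset

# Assumed repository id, following the leaderboard's details-dataset naming convention.
data = load_dataset("open-llm-leaderboard/details_ZoidBB__MultiKory-0.1-4x11b-pre1",
	"harness_winogrande_5",
	split="train")
```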
## Latest results
These are the latest results from run 2024-02-05T08:07:31.524035 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of ZoidBB/MultiKory-0.1-4x11b-pre1\n\n\n\nDataset automatically created during the evaluation run of model ZoidBB/MultiKory-0.1-4x11b-pre1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-05T08:07:31.524035(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of ZoidBB/MultiKory-0.1-4x11b-pre1\n\n\n\nDataset automatically created during the evaluation run of model ZoidBB/MultiKory-0.1-4x11b-pre1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-05T08:07:31.524035(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
32e73ffb31aa55c9b6ca57c34c9dd96d2639e297 |
This dataset was intended to be used for fine-tuning the Nepali text summarization task.
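As a minimal sketch, the dataset can be loaded with the `datasets` library (the column names are not documented here, so inspect the loaded splits before training):

```python
from datasets import load_dataset

# Load the Nepali summarization dataset; print ds to see the available splits and columns.
ds = load_dataset("sanjeev-bhandari01/nepali-summarization-dataset")
print(ds)
```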
Feel free to contribute to this readme and add any information. | sanjeev-bhandari01/nepali-summarization-dataset | [
"task_categories:summarization",
"task_categories:text2text-generation",
"language:ne",
"license:wtfpl",
"nepali",
"np",
"nepali-text-summerizatio",
"article-title",
"region:us"
] | 2024-02-05T08:33:30+00:00 | {"language": ["ne"], "license": "wtfpl", "task_categories": ["summarization", "text2text-generation"], "pretty_name": "Nepali summerizatio dataset ", "tags": ["nepali", "np", "nepali-text-summerizatio", "article-title"]} | 2024-02-09T11:27:38+00:00 | [] | [
"ne"
] | TAGS
#task_categories-summarization #task_categories-text2text-generation #language-Nepali (macrolanguage) #license-wtfpl #nepali #np #nepali-text-summerizatio #article-title #region-us
|
This dataset was intended to to be used for finetuning the nepali text summerization task.
Feel free to contribute to this readme to add any information | [] | [
"TAGS\n#task_categories-summarization #task_categories-text2text-generation #language-Nepali (macrolanguage) #license-wtfpl #nepali #np #nepali-text-summerizatio #article-title #region-us \n"
] |
3dc112587f6e508c01ef29bb9d3b599f56006bf2 |
# Dataset Card for Evaluation run of dddsaty/SOLAR_Merge_Adapter_DPO_Orca
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [dddsaty/SOLAR_Merge_Adapter_DPO_Orca](https://huggingface.co/dddsaty/SOLAR_Merge_Adapter_DPO_Orca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_dddsaty__SOLAR_Merge_Adapter_DPO_Orca",
"harness_winogrande_5",
split="train")
```
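As a follow-up sketch, the aggregated results mentioned above can be loaded through the "results" configuration; the "latest" split name is taken from this card's file listing and always points to the most recent run.

```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_dddsaty__SOLAR_Merge_Adapter_DPO_Orca",
	"results",
	split="latest")
```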
## Latest results
These are the [latest results from run 2024-02-05T08:48:15.938281](https://huggingface.co/datasets/open-llm-leaderboard/details_dddsaty__SOLAR_Merge_Adapter_DPO_Orca/blob/main/results_2024-02-05T08-48-15.938281.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6327439375185642,
"acc_stderr": 0.032443250608216324,
"acc_norm": 0.6355200172658205,
"acc_norm_stderr": 0.03310341022715256,
"mc1": 0.36107711138310894,
"mc1_stderr": 0.016814312844836882,
"mc2": 0.51488245253393,
"mc2_stderr": 0.015188854393420268
},
"harness|arc:challenge|25": {
"acc": 0.6109215017064846,
"acc_stderr": 0.014247309976045607,
"acc_norm": 0.6390784982935154,
"acc_norm_stderr": 0.014034761386175452
},
"harness|hellaswag|10": {
"acc": 0.6499701254730134,
"acc_stderr": 0.004760041843651493,
"acc_norm": 0.8458474407488548,
"acc_norm_stderr": 0.0036035695286784114
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.04284958639753401,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.04284958639753401
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6641509433962264,
"acc_stderr": 0.02906722014664483,
"acc_norm": 0.6641509433962264,
"acc_norm_stderr": 0.02906722014664483
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.03745554791462456,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.03745554791462456
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099522,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099522
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6,
"acc_stderr": 0.04082482904638628,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04082482904638628
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.025331202438944423,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.025331202438944423
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7322580645161291,
"acc_stderr": 0.02518900666021238,
"acc_norm": 0.7322580645161291,
"acc_norm_stderr": 0.02518900666021238
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.43349753694581283,
"acc_stderr": 0.03486731727419872,
"acc_norm": 0.43349753694581283,
"acc_norm_stderr": 0.03486731727419872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8282828282828283,
"acc_stderr": 0.0268697161874299,
"acc_norm": 0.8282828282828283,
"acc_norm_stderr": 0.0268697161874299
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.023814477086593566,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.023814477086593566
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6076923076923076,
"acc_stderr": 0.02475600038213095,
"acc_norm": 0.6076923076923076,
"acc_norm_stderr": 0.02475600038213095
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.029723278961476664,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.029723278961476664
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.030684737115135353,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.030684737115135353
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.03983798306659809,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.03983798306659809
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8165137614678899,
"acc_stderr": 0.016595259710399293,
"acc_norm": 0.8165137614678899,
"acc_norm_stderr": 0.016595259710399293
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.033812000056435254,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.033812000056435254
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8725490196078431,
"acc_stderr": 0.02340553048084631,
"acc_norm": 0.8725490196078431,
"acc_norm_stderr": 0.02340553048084631
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7085201793721974,
"acc_stderr": 0.030500283176545847,
"acc_norm": 0.7085201793721974,
"acc_norm_stderr": 0.030500283176545847
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7175572519083969,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.7175572519083969,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281372,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281372
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8199233716475096,
"acc_stderr": 0.013740797258579828,
"acc_norm": 0.8199233716475096,
"acc_norm_stderr": 0.013740797258579828
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.684971098265896,
"acc_stderr": 0.025009313790069716,
"acc_norm": 0.684971098265896,
"acc_norm_stderr": 0.025009313790069716
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.35195530726256985,
"acc_stderr": 0.01597266852368907,
"acc_norm": 0.35195530726256985,
"acc_norm_stderr": 0.01597266852368907
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.025917806117147158,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.025917806117147158
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.02456922360046085,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.02456922360046085
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.029790719243829714,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.029790719243829714
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4680573663624511,
"acc_stderr": 0.012744149704869647,
"acc_norm": 0.4680573663624511,
"acc_norm_stderr": 0.012744149704869647
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6323529411764706,
"acc_stderr": 0.02928941340940319,
"acc_norm": 0.6323529411764706,
"acc_norm_stderr": 0.02928941340940319
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6437908496732027,
"acc_stderr": 0.0193733324207245,
"acc_norm": 0.6437908496732027,
"acc_norm_stderr": 0.0193733324207245
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.038695433234721015,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.038695433234721015
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.03094445977853321,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.03094445977853321
},
"harness|truthfulqa:mc|0": {
"mc1": 0.36107711138310894,
"mc1_stderr": 0.016814312844836882,
"mc2": 0.51488245253393,
"mc2_stderr": 0.015188854393420268
},
"harness|winogrande|5": {
"acc": 0.8200473559589582,
"acc_stderr": 0.01079646868806868
},
"harness|gsm8k|5": {
"acc": 0.5056861258529188,
"acc_stderr": 0.013771594106283036
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_dddsaty__SOLAR_Merge_Adapter_DPO_Orca | [
"region:us"
] | 2024-02-05T08:50:36+00:00 | {"pretty_name": "Evaluation run of dddsaty/SOLAR_Merge_Adapter_DPO_Orca", "dataset_summary": "Dataset automatically created during the evaluation run of model [dddsaty/SOLAR_Merge_Adapter_DPO_Orca](https://huggingface.co/dddsaty/SOLAR_Merge_Adapter_DPO_Orca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dddsaty__SOLAR_Merge_Adapter_DPO_Orca\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-05T08:48:15.938281](https://huggingface.co/datasets/open-llm-leaderboard/details_dddsaty__SOLAR_Merge_Adapter_DPO_Orca/blob/main/results_2024-02-05T08-48-15.938281.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6327439375185642,\n \"acc_stderr\": 0.032443250608216324,\n \"acc_norm\": 0.6355200172658205,\n \"acc_norm_stderr\": 0.03310341022715256,\n \"mc1\": 0.36107711138310894,\n \"mc1_stderr\": 0.016814312844836882,\n \"mc2\": 0.51488245253393,\n \"mc2_stderr\": 0.015188854393420268\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6109215017064846,\n \"acc_stderr\": 0.014247309976045607,\n \"acc_norm\": 0.6390784982935154,\n \"acc_norm_stderr\": 0.014034761386175452\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6499701254730134,\n \"acc_stderr\": 0.004760041843651493,\n \"acc_norm\": 0.8458474407488548,\n \"acc_norm_stderr\": 0.0036035695286784114\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6641509433962264,\n \"acc_stderr\": 0.02906722014664483,\n \"acc_norm\": 0.6641509433962264,\n \"acc_norm_stderr\": 0.02906722014664483\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.03745554791462456,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.03745554791462456\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.04960449637488584\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.03703851193099522,\n \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.03703851193099522\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04082482904638628,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04082482904638628\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41005291005291006,\n \"acc_stderr\": 0.025331202438944423,\n \"acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.025331202438944423\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7322580645161291,\n \"acc_stderr\": 0.02518900666021238,\n \"acc_norm\": 0.7322580645161291,\n \"acc_norm_stderr\": 0.02518900666021238\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.43349753694581283,\n \"acc_stderr\": 0.03486731727419872,\n \"acc_norm\": 0.43349753694581283,\n \"acc_norm_stderr\": 0.03486731727419872\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939098,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939098\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8282828282828283,\n \"acc_stderr\": 0.0268697161874299,\n \"acc_norm\": 0.8282828282828283,\n \"acc_norm_stderr\": 0.0268697161874299\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.023814477086593566,\n \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.023814477086593566\n 
},\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6076923076923076,\n \"acc_stderr\": 0.02475600038213095,\n \"acc_norm\": 0.6076923076923076,\n \"acc_norm_stderr\": 0.02475600038213095\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.029723278961476664,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.029723278961476664\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.030684737115135353,\n \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.030684737115135353\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.39072847682119205,\n \"acc_stderr\": 0.03983798306659809,\n \"acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.03983798306659809\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8165137614678899,\n \"acc_stderr\": 0.016595259710399293,\n \"acc_norm\": 0.8165137614678899,\n \"acc_norm_stderr\": 0.016595259710399293\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5648148148148148,\n \"acc_stderr\": 0.033812000056435254,\n \"acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.033812000056435254\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8725490196078431,\n \"acc_stderr\": 0.02340553048084631,\n \"acc_norm\": 0.8725490196078431,\n \"acc_norm_stderr\": 0.02340553048084631\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7085201793721974,\n \"acc_stderr\": 0.030500283176545847,\n \"acc_norm\": 0.7085201793721974,\n \"acc_norm_stderr\": 0.030500283176545847\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7175572519083969,\n \"acc_stderr\": 0.03948406125768361,\n \"acc_norm\": 0.7175572519083969,\n \"acc_norm_stderr\": 0.03948406125768361\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.021586494001281372,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.021586494001281372\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8199233716475096,\n 
\"acc_stderr\": 0.013740797258579828,\n \"acc_norm\": 0.8199233716475096,\n \"acc_norm_stderr\": 0.013740797258579828\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.684971098265896,\n \"acc_stderr\": 0.025009313790069716,\n \"acc_norm\": 0.684971098265896,\n \"acc_norm_stderr\": 0.025009313790069716\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.35195530726256985,\n \"acc_stderr\": 0.01597266852368907,\n \"acc_norm\": 0.35195530726256985,\n \"acc_norm_stderr\": 0.01597266852368907\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.025917806117147158,\n \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.025917806117147158\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.02456922360046085,\n \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.02456922360046085\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.475177304964539,\n \"acc_stderr\": 0.029790719243829714,\n \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.029790719243829714\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4680573663624511,\n \"acc_stderr\": 0.012744149704869647,\n \"acc_norm\": 0.4680573663624511,\n \"acc_norm_stderr\": 0.012744149704869647\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6323529411764706,\n \"acc_stderr\": 0.02928941340940319,\n \"acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.02928941340940319\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6437908496732027,\n \"acc_stderr\": 0.0193733324207245,\n \"acc_norm\": 0.6437908496732027,\n \"acc_norm_stderr\": 0.0193733324207245\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.038695433234721015,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.038695433234721015\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.03094445977853321,\n \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.03094445977853321\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.36107711138310894,\n \"mc1_stderr\": 0.016814312844836882,\n \"mc2\": 0.51488245253393,\n \"mc2_stderr\": 0.015188854393420268\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8200473559589582,\n \"acc_stderr\": 0.01079646868806868\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5056861258529188,\n \"acc_stderr\": 0.013771594106283036\n }\n}\n```", "repo_url": 
"https://huggingface.co/dddsaty/SOLAR_Merge_Adapter_DPO_Orca", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|arc:challenge|25_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|gsm8k|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|hellaswag|10_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T08-48-15.938281.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T08-48-15.938281.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-05T08-48-15.938281.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-05T08-48-15.938281.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T08-48-15.938281.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_05T08_48_15.938281", "path": ["**/details_harness|winogrande|5_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-05T08-48-15.938281.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_05T08_48_15.938281", "path": ["results_2024-02-05T08-48-15.938281.parquet"]}, {"split": "latest", "path": ["results_2024-02-05T08-48-15.938281.parquet"]}]}]} | 2024-02-05T08:51:01+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of dddsaty/SOLAR_Merge_Adapter_DPO_Orca
Dataset automatically created during the evaluation run of model dddsaty/SOLAR_Merge_Adapter_DPO_Orca on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
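A minimal sketch is shown below; the repository id is inferred from the leaderboard's usual `details_<org>__<model>` naming and may need adjusting:

```python
from datasets import load_dataset

# Repository id inferred from the standard naming pattern used by these
# evaluation datasets; adjust it if the actual id differs.
data = load_dataset(
    "open-llm-leaderboard/details_dddsaty__SOLAR_Merge_Adapter_DPO_Orca",
    "harness_winogrande_5",
    split="train",
)
```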
## Latest results
These are the latest results from run 2024-02-05T08:48:15.938281 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of dddsaty/SOLAR_Merge_Adapter_DPO_Orca\n\n\n\nDataset automatically created during the evaluation run of model dddsaty/SOLAR_Merge_Adapter_DPO_Orca on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-05T08:48:15.938281(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of dddsaty/SOLAR_Merge_Adapter_DPO_Orca\n\n\n\nDataset automatically created during the evaluation run of model dddsaty/SOLAR_Merge_Adapter_DPO_Orca on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-05T08:48:15.938281(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
52096d395c0c340bc6e015daf291472bdc9238bd |
# Dataset Card for Evaluation run of sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO-recovered
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO-recovered](https://huggingface.co/sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO-recovered) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_sonthenguyen__OpenHermes-2.5-Mistral-7B-mt-bench-DPO-recovered",
"harness_winogrande_5",
split="train")
```
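
To work with the aggregated metrics directly, the "results" configuration described above can be loaded in the same way (a small sketch; the "latest" split follows the naming used by this repository):

```python
from datasets import load_dataset

# The "results" configuration holds the aggregated metrics of the run;
# the "latest" split always points to the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_sonthenguyen__OpenHermes-2.5-Mistral-7B-mt-bench-DPO-recovered",
    "results",
    split="latest",
)
print(results[0])  # one row containing the aggregated metrics
```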
## Latest results
These are the [latest results from run 2024-02-05T08:48:44.538072](https://huggingface.co/datasets/open-llm-leaderboard/details_sonthenguyen__OpenHermes-2.5-Mistral-7B-mt-bench-DPO-recovered/blob/main/results_2024-02-05T08-48-44.538072.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6395110711097536,
"acc_stderr": 0.0322486599464856,
"acc_norm": 0.6419597596321146,
"acc_norm_stderr": 0.03288981571306671,
"mc1": 0.3598531211750306,
"mc1_stderr": 0.016801860466677157,
"mc2": 0.5291306939423928,
"mc2_stderr": 0.015285941575450697
},
"harness|arc:challenge|25": {
"acc": 0.613481228668942,
"acc_stderr": 0.014230084761910471,
"acc_norm": 0.6527303754266212,
"acc_norm_stderr": 0.013913034529620453
},
"harness|hellaswag|10": {
"acc": 0.6564429396534555,
"acc_stderr": 0.0047392481181180125,
"acc_norm": 0.8462457677753435,
"acc_norm_stderr": 0.003599758043546812
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.02854479331905533,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.02854479331905533
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.02546714904546955,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.02546714904546955
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083525,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175007,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175007
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.032250781083062896,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.032250781083062896
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768776,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768776
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6153846153846154,
"acc_stderr": 0.024666744915187208,
"acc_norm": 0.6153846153846154,
"acc_norm_stderr": 0.024666744915187208
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.028133252578815632,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.028133252578815632
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658752,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658752
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8330275229357799,
"acc_stderr": 0.01599015488507338,
"acc_norm": 0.8330275229357799,
"acc_norm_stderr": 0.01599015488507338
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.02812597226565437,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.02812597226565437
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.030636591348699803,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.030636591348699803
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.039578354719809805,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.039578354719809805
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.04742762361243011,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.04742762361243011
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165612,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165612
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8288633461047255,
"acc_stderr": 0.013468201614066302,
"acc_norm": 0.8288633461047255,
"acc_norm_stderr": 0.013468201614066302
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.02425790170532338,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.02425790170532338
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3128491620111732,
"acc_stderr": 0.015506892594647262,
"acc_norm": 0.3128491620111732,
"acc_norm_stderr": 0.015506892594647262
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7516339869281046,
"acc_stderr": 0.02473998135511359,
"acc_norm": 0.7516339869281046,
"acc_norm_stderr": 0.02473998135511359
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6881028938906752,
"acc_stderr": 0.02631185807185416,
"acc_norm": 0.6881028938906752,
"acc_norm_stderr": 0.02631185807185416
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.02399350170904211,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.02399350170904211
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5070921985815603,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.5070921985815603,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46936114732724904,
"acc_stderr": 0.012746237711716634,
"acc_norm": 0.46936114732724904,
"acc_norm_stderr": 0.012746237711716634
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462937,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462937
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162662,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162662
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.027979823538744543,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.027979823538744543
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.027403859410786845,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.027403859410786845
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3598531211750306,
"mc1_stderr": 0.016801860466677157,
"mc2": 0.5291306939423928,
"mc2_stderr": 0.015285941575450697
},
"harness|winogrande|5": {
"acc": 0.7805840568271507,
"acc_stderr": 0.01163126836060778
},
"harness|gsm8k|5": {
"acc": 0.5830174374526156,
"acc_stderr": 0.013581320997216591
}
}
```
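
The same figures can also be read from the raw results file linked above (a sketch using `huggingface_hub`; the filename is taken from the link, and the JSON layout may nest the per-task metrics under a "results" key):

```python
import json
from huggingface_hub import hf_hub_download

# Download the raw results JSON for this run from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_sonthenguyen__OpenHermes-2.5-Mistral-7B-mt-bench-DPO-recovered",
    filename="results_2024-02-05T08-48-44.538072.json",
    repo_type="dataset",
)
with open(path) as f:
    raw = json.load(f)

# The per-task metrics may sit under a "results" key; fall back to the root otherwise.
metrics = raw.get("results", raw)
print(metrics["all"])
```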
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_sonthenguyen__OpenHermes-2.5-Mistral-7B-mt-bench-DPO-recovered | [
"region:us"
] | 2024-02-05T08:51:05+00:00 | {"pretty_name": "Evaluation run of sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO-recovered", "dataset_summary": "Dataset automatically created during the evaluation run of model [sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO-recovered](https://huggingface.co/sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO-recovered) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_sonthenguyen__OpenHermes-2.5-Mistral-7B-mt-bench-DPO-recovered\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-05T08:48:44.538072](https://huggingface.co/datasets/open-llm-leaderboard/details_sonthenguyen__OpenHermes-2.5-Mistral-7B-mt-bench-DPO-recovered/blob/main/results_2024-02-05T08-48-44.538072.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6395110711097536,\n \"acc_stderr\": 0.0322486599464856,\n \"acc_norm\": 0.6419597596321146,\n \"acc_norm_stderr\": 0.03288981571306671,\n \"mc1\": 0.3598531211750306,\n \"mc1_stderr\": 0.016801860466677157,\n \"mc2\": 0.5291306939423928,\n \"mc2_stderr\": 0.015285941575450697\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.613481228668942,\n \"acc_stderr\": 0.014230084761910471,\n \"acc_norm\": 0.6527303754266212,\n \"acc_norm_stderr\": 0.013913034529620453\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6564429396534555,\n \"acc_stderr\": 0.0047392481181180125,\n \"acc_norm\": 0.8462457677753435,\n \"acc_norm_stderr\": 0.003599758043546812\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.02854479331905533,\n \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.02854479331905533\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n 
\"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546955\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083525,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175007,\n \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175007\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.032250781083062896,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.032250781083062896\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 
0.022473253332768776,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768776\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6153846153846154,\n \"acc_stderr\": 0.024666744915187208,\n \"acc_norm\": 0.6153846153846154,\n \"acc_norm_stderr\": 0.024666744915187208\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3074074074074074,\n \"acc_stderr\": 0.028133252578815632,\n \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.028133252578815632\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658752,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658752\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8330275229357799,\n \"acc_stderr\": 0.01599015488507338,\n \"acc_norm\": 0.8330275229357799,\n \"acc_norm_stderr\": 0.01599015488507338\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7990196078431373,\n \"acc_stderr\": 0.02812597226565437,\n \"acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.02812597226565437\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n \"acc_stderr\": 0.030636591348699803,\n \"acc_norm\": 0.7040358744394619,\n \"acc_norm_stderr\": 0.030636591348699803\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.039578354719809805,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.039578354719809805\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n \"acc_stderr\": 0.04742762361243011,\n \"acc_norm\": 0.5178571428571429,\n \"acc_norm_stderr\": 0.04742762361243011\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.022209309073165612,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.022209309073165612\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n 
\"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8288633461047255,\n \"acc_stderr\": 0.013468201614066302,\n \"acc_norm\": 0.8288633461047255,\n \"acc_norm_stderr\": 0.013468201614066302\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.02425790170532338,\n \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.02425790170532338\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3128491620111732,\n \"acc_stderr\": 0.015506892594647262,\n \"acc_norm\": 0.3128491620111732,\n \"acc_norm_stderr\": 0.015506892594647262\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7516339869281046,\n \"acc_stderr\": 0.02473998135511359,\n \"acc_norm\": 0.7516339869281046,\n \"acc_norm_stderr\": 0.02473998135511359\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n \"acc_stderr\": 0.02631185807185416,\n \"acc_norm\": 0.6881028938906752,\n \"acc_norm_stderr\": 0.02631185807185416\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.02399350170904211,\n \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.02399350170904211\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5070921985815603,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.5070921985815603,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46936114732724904,\n \"acc_stderr\": 0.012746237711716634,\n \"acc_norm\": 0.46936114732724904,\n \"acc_norm_stderr\": 0.012746237711716634\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462937,\n \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462937\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162662,\n \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162662\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.027979823538744543,\n \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.027979823538744543\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n \"acc_stderr\": 0.027403859410786845,\n \"acc_norm\": 0.8159203980099502,\n \"acc_norm_stderr\": 0.027403859410786845\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3598531211750306,\n \"mc1_stderr\": 0.016801860466677157,\n \"mc2\": 0.5291306939423928,\n \"mc2_stderr\": 0.015285941575450697\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7805840568271507,\n \"acc_stderr\": 
0.01163126836060778\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5830174374526156,\n \"acc_stderr\": 0.013581320997216591\n }\n}\n```", "repo_url": "https://huggingface.co/sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO-recovered", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|arc:challenge|25_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|gsm8k|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hellaswag|10_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T08-48-44.538072.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-05T08-48-44.538072.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T08-48-44.538072.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-05T08-48-44.538072.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T08-48-44.538072.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["**/details_harness|winogrande|5_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-02-05T08-48-44.538072.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_05T08_48_44.538072", "path": ["results_2024-02-05T08-48-44.538072.parquet"]}, {"split": "latest", "path": ["results_2024-02-05T08-48-44.538072.parquet"]}]}]} | 2024-02-05T08:51:31+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO-recovered
Dataset automatically created during the evaluation run of model sonthenguyen/OpenHermes-2.5-Mistral-7B-mt-bench-DPO-recovered on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, one corresponding to each of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
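For orientation, the snippet below lists the available configurations and the splits of the "results" configuration. The details repository id is an assumption here, following the leaderboard's usual `details_<org>__<model>` naming convention; substitute the actual dataset id if it differs.

```python
from datasets import get_dataset_config_names, get_dataset_split_names

# Assumed repo id, based on the leaderboard's "details_<org>__<model>" convention.
REPO_ID = "open-llm-leaderboard/details_sonthenguyen__OpenHermes-2.5-Mistral-7B-mt-bench-DPO-recovered"

configs = get_dataset_config_names(REPO_ID)           # 63 task configs plus "results"
splits = get_dataset_split_names(REPO_ID, "results")  # timestamped runs plus "latest"
print(len(configs), splits)
```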
To load the details from a run, you can for instance do the following:
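A minimal sketch, assuming the same repository id as above and one of the task configurations listed in this card's metadata (here the 5-shot GSM8K details):

```python
from datasets import load_dataset

# "latest" always points at the most recent evaluation run;
# a timestamped split (e.g. "2024_02_05T08_48_44.538072") pins a specific run.
data = load_dataset(
    "open-llm-leaderboard/details_sonthenguyen__OpenHermes-2.5-Mistral-7B-mt-bench-DPO-recovered",
    "harness_gsm8k_5",
    split="latest",
)
print(data)
```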
## Latest results
These are the latest results from run 2024-02-05T08:48:44.538072 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" configuration and in the "latest" split of each eval).
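The per-metric numbers are embedded in this card's metadata; for instance, this run reports a 5-shot GSM8K accuracy of about 0.583 and a 5-shot Winogrande accuracy of about 0.781. Below is a minimal sketch for pulling the aggregated numbers programmatically, again assuming the repository id used above:

```python
from datasets import load_dataset

# The "results" configuration aggregates one row per evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_sonthenguyen__OpenHermes-2.5-Mistral-7B-mt-bench-DPO-recovered",
    "results",
    split="latest",
)
df = results.to_pandas()  # requires pandas; convenient for browsing the metric columns
print(df.columns[:10])
```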
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact